Tackling information disorder calls for more than stricter regulation

This article was first published in:
Ahead of the Feb. 14, 2024, elections, online civic space in the country has been increasingly polluted by misinformation and disinformation, which undermine electoral integrity. Routine exposure to misinformation spreading across digital platforms in various forms, along with the growth of its social influence, has eroded public faith in mainstream media and democratic institutions.

Amid this conundrum, the Working Committee (Panja) of House of Representatives Commission I and the government have finalized the second revision of the Information and Electronic Transactions (ITE) Law after about six months of deliberation. A House plenary session has reportedly been scheduled this week to pass the amended law.

Civil society has long advocated for an overhaul of the law because of its articles that are open to multiple interpretations and have frequently been used to silence critics, its provisions that are obsolete in light of current developments in digital technology and the internet, and its lack of a mandate for judicial scrutiny over the government's takedown requests. Yet the ongoing process deals with only a handful of provisions and has been conducted mostly behind closed doors. The revision can thus be described as a missed opportunity and a potential threat to the democratic process leading up to the elections.

The ITE Law stipulates that spreading hoaxes or fake news is a crime punishable by a prison sentence of up to six years and/or a fine of up to Rp 1 billion (US$64,700). However, the law does not clearly define what constitutes fake news or how it differs from other types of information disorder, such as misinformation, disinformation and malinformation.
Treating these different types of information disorder as one and the same is problematic, if not misguided: while disinformation and malinformation are spread with malicious intent, misinformation refers to the spread of false information without the intention to mislead. The ITE Law's lack of clarity on which type of information disorder is punishable risks widespread criminalization of members of the public, given the country's low information literacy. It is therefore important that the ITE Law consider intent and risk, based on the level of real-world material harm caused, in defining sanctions against the agents of information disorder. Further, restrictions on free speech and free expression online must be legitimate, necessary and proportionate.

Currently, Communications and Information Ministerial Regulation No. 5/2020, a derivative regulation of the ITE Law, stipulates that private electronic system operators (ESOs) must take down access to electronic information and documents deemed illegal in Indonesia within four to 24 hours of being notified that the content exists on their platform. Failure to do so can result in administrative sanctions, including temporary suspension of access to online platforms, general blocking of systems, fines and revocation of registration certificates.

Heavy sanctions on internet intermediaries can have a significant chilling effect on freedom of expression. Existing intermediary regulations force private ESOs to proactively monitor content, which risks over-censorship of speech, including legitimate commentary that merely appears misleading or false. While stricter regulatory approaches aim at faster removal of illegal and harmful online content and at closing the accountability gap left by inconsistent transparency reporting among internet intermediaries, their collateral damage to freedom of expression remains unmitigated.
This demonstrates that any regulatory intervention must be clear about its objectives and the harms it seeks to address. Rather than aiming narrowly at tackling illegal and harmful content, internet intermediary regulations must promote and enforce transparency, accountability and the protection of human rights online. This requires collaboration in which key stakeholders in the ecosystem share responsibility for striking a fine balance between addressing illegal content and respecting other social values, such as freedom of expression and speech, diversity and innovation. An effective regulatory framework therefore needs to provide legal certainty for the actors involved in addressing illegal content, while accommodating emerging technology and improving transparency around platforms' content-moderation practices.

The Digital Services Act (DSA) in the European Union, which sets responsibilities and sanctions for internet intermediaries proportionately, can serve as a good example for Indonesia. The DSA clearly defines transparency obligations and internal due-process obligations for internet intermediaries and assigns greater responsibilities to very large online platforms (VLOPs) and very large online search engines (VLOSEs). It is complemented by the Digital Markets Act (DMA), which is designed to enable healthy competition and innovation in the digital sector.

Fighting disinformation demands bespoke solutions, not just blanket regulations. To mitigate potential harm to free speech, interventions should focus less on regulating content and more on regulating processes. In this regard, the revision of the ITE Law needs to reformulate internet intermediary liability based on intermediaries' business models and the services they offer, and to clearly define their administrative and technical obligations based on such categorization of ESOs.
Further, it is important to maintain the "safe harbor" provision in the ITE Law to ensure that internet intermediaries that have established preventive and mitigation measures against the spread of illegal and harmful third-party content cannot be held liable for it. Such conditional immunity from liability, however, needs to be accompanied by proportionate "notice and action" procedures. The Manila Principles on Intermediary Liability provide useful guidance on how such procedures should work.

Concerns have arisen over global policy trends compelling internet intermediaries to compromise their security systems for easier content monitoring and surveillance, by mandating encryption backdoors to counter information disorder and illegal and harmful content. The revision of the ITE Law must not bring about such an outcome.

As legislation that regulates online speech, the ITE Law's provisions determine the enjoyment of human rights online, the shape of our digital civic space and the quality of our democracy, both electoral and deliberative. However, the deliberation of the second revision has so far been largely closed to public participation. Such a process violates the democratic principle of meaningful participation, under which the public should have the right to be heard, the right to receive information, the right to have their input considered, the right to receive an explanation and the right to file a complaint.

The revision process must not be hasty and should be open to the participation of all affected stakeholders: civil society, internet users and private ESOs alike. Strategic and structured public involvement in revising the law not only mitigates the risk of friction and resistance in its implementation but also increases compliance and support.
Deliberative and collaborative processes in developing internet regulations and co-regulation mechanisms have proven beneficial for governments in several countries. Going forward, there need to be more proactive and collaborative efforts such as the Safer Internet Lab (SAIL) of the Centre for Strategic and International Studies (CSIS) and Google Indonesia, which seeks to find common ground between different stakeholder groups in handling information disorder. Civil society also needs to expand conversations on information disorder and its threat to democracy beyond the usual participants in these forums, extending the opportunity for ordinary citizens to take part in public deliberation.



© 2024 · Safer Internet Lab