As political tension heats up ahead of 2024, Indonesia needs to dismantle disinformation

The political climate is steadily heating up en route to the 2024 general election. As we have observed in previous election cycles, the proliferation of disinformation on social media can fuel detrimental political discourse. And as the rules and dynamics of social media evolve by the second, so do the patterns of disinformation campaigns.

Both social media usage and disinformation tactics show rising trends. The 2022 Young Voters survey report from the Jakarta-based Centre for Strategic and International Studies (CSIS) found that social media has progressively become the preferred medium for political expression and the primary source of information for young people, who will make up the majority (54 percent) of voters in next year’s election.

Similarly, the 2020-2021 infodemic showed that information disorder often rides the waves of public conversation. The tide is rising again ahead of the 2024 election, as hoax statistics from the Communications and Information Ministry demonstrate. The increasing centrality of social media may give disinformation an even harsher impact this time around.

At least five lingering issues need addressing. First, false information now comes in more complex forms. Data from the Indonesian Anti-Slander Society (Mafindo) revealed that in 2020, more than a third (38 percent) of reported hoaxes came in a mixed format of text, audio, video, photos and images. In 2021, the figure rose to almost two-thirds (64 percent), alongside the surging popularity of short video-sharing platforms. A complex format entails more intricate debunking steps, posing novel challenges for mitigation efforts.

Second, steps to mitigate disinformation are plentiful, but efforts to prevent it at the source are fewer. Borrowing from the supply chain analogy, information disorder has upstream and downstream elements: the upstream deals with production, while the downstream deals with dissemination.

Steps taken downstream are primarily designed to prepare the audience, curb the spread or respond with facts. For example, the Digital Literacy Program improves people’s information literacy. Messaging platforms’ forwarding limits impede the virality of false messages. Fact-checking initiatives and “hoax-buster” tools by journalists and civil society groups provide factual counternarratives.

These interventions are all necessary and have yielded results. However, relying on mitigation or debunking alone can only go so far. After all, most Indonesians (68.4 percent) are still unsure of their ability to identify hoaxes. Preventive measures need to be the first layer of defense.

Third, existing pre-bunking measures focus on users as potential recipients of disinformation rather than on discouraging its producers. Pre-bunking techniques are meant to inoculate users against inbound disinformation so that they can recognize attempts to distort facts and judge for themselves which information is accurate.

Currently, pre-bunking training is mainly conducted by journalists’ associations and civil society groups through workshops and educational materials. The Elections Supervisory Agency and the Communications and Information Ministry have also begun to pick up the pace.

Pre-bunking may have a wider reach than debunking because it empowers users to verify information themselves rather than rely on fact-checkers. However, users can only be held responsible for the disinformation they help spread; the far greater responsibility should fall on the actors who create it in the first place.

Fourth, targeting the source raises the challenge of attribution: identifying the actors responsible for producing and spreading disinformation. Sastramidjaja, Berenschot, Wijayanto and Fahmi (2021) have investigated the workings of cyber troops who orchestrate online campaigns, manipulating information to manufacture public consensus on certain candidates or policies. In response, social media platforms have improved their ability to “name and shame” the perpetrators by enhancing their content algorithms.

However, as means of identification evolve, so do the operations. Many disinformation campaigns now work under the radar, moving away from popular social media platforms to end-to-end encrypted messaging apps. The fluid and adaptive nature of these troops makes it difficult to gauge the true magnitude of disinformation production at any given time, let alone to pinpoint the actors.

Fifth, there is a lack of transparency. Identifying the instigators may be easier than going after their employers, especially when state or corporate resources are involved. Economic and political elites often outsource disinformation campaigns to third parties to gain plausible deniability.

This deniability has enabled funders to sidestep, for example, General Elections Commission (KPU) Regulation No. 23/2018, which limits the number of official social media accounts affiliated with a candidate or a political party to ten per platform. And yet the work of cyber troops knows no campaign schedule or account limits.

Addressing disinformation at the point of infection requires us to venture into uncharted territory. We need a collective research effort that fills these gaps: one that keeps pace with the ever-moving target of computational propaganda, empowers stakeholders to craft adaptive regulations and explores cutting-edge tools and means to prevent manipulative information.

The endgame is to build multiple, coherent layers of intervention aimed at dissuading the producers, while also making fact-checking efforts sustainable, building sensible content moderation on platforms and improving users’ information literacy.

Finally, disinformation presents democracies with a dilemma: letting it run amok and managing it with the iron hand of censorship are both recipes for democratic backsliding. We must address it in a way that promotes both freedom of expression and online safety.
