Co-authored by Temiloluwa Alalade

During election season, voters are inundated with campaign materials intended to sway their votes. To engage with this information meaningfully, voters need basic digital and media literacy as well as critical thinking skills. Experts in Kenya warn that election disinformation may increase electoral violence globally.

Election-related misinformation and disinformation primarily seek to manipulate the decision-making of electorates, cast doubt on the electoral process, and delegitimize the outcome of elections. This is a dangerous trend, particularly in fragile democracies, where it can incite hate and stir violent outbreaks. Misinformation and disinformation were major factors in the escalation of post-election violence in Kenya following the 2017 general elections. Similarly, in 2020, the Central African Republic experienced deadly post-election violence as a result of a contested election, targeted disinformation efforts, and divisive language. The same happened in Cote d’Ivoire, where 50 people were killed in the political and intercommunal violence that plagued the presidential election on October 31, 2020.

The forthcoming elections in Kenya and Nigeria already present a media space, including social media, rife with disinformation. According to a recent study by Mozilla, 300 accounts were uploading hate speech and disinformation videos about the Kenyan election on TikTok alone. Even more alarming is that the same messaging used in 2007 is still being used in 2022. The investigation also showed that over four million users on the platform had watched videos from the 300 accounts. Given that TikTok videos frequently spread to other platforms, the actual number of views may be significantly higher.

Confronting disinformation during elections

Understanding the nature of mis/disinformation and how to effectively confront it in the context of elections is necessary for reducing vulnerability to it. Simply put, misinformation is false, inaccurate or misleading information, regardless of the intention to deceive. Disinformation, on the other hand, is the deliberate creation, distribution and amplification of false, inaccurate or misleading information with the intent to deceive. In elections, propaganda is used as a tool to manipulate the voting population by affecting their beliefs, attitudes or preferences in order to influence their decisions. Gendered disinformation, meanwhile, is used as a tactic to curb the participation of women and gender-diverse people in politics: it is the spread of deceptive or inaccurate information and images that are gendered. While gendered disinformation can target anyone, women and gender-diverse people are disproportionately impacted. A global study by Plan International reveals that when gendered disinformation is left unchecked in elections, it distorts public understanding of the track record of women in politics, discourages women from participating in politics and governance, and can lead to online and offline violence against women.


Election season is typically a period of heightened tension and volatility. Different players scramble to control the narrative, deploying emotionally charged stories to sway public opinion and the decisions of the electorate. Due to the peculiarity of elections and the ubiquity of social media, mis/disinformation often festers unchecked, leaving people confused, divided and doubtful of the electoral process. Countering the threat of mis/disinformation in elections requires an understanding of the ways in which it manifests.

Mis/disinformation tactics during elections

  • Voter suppression content

This content is targeted at the suppression or disenfranchisement of a targeted group of people. It might include misrepresenting the date and time of the election, portraying a threat of violence within a particular region, using health and safety scares to dissuade voter turnout, or misrepresenting the facts around voter registration or voter cards. For example, in countries where permanent voter’s cards are issued, false news may circulate in specific regions that the cards have expired and are therefore not valid for use. This is done to cause confusion and reduce voter turnout. In other instances, the electoral body might run a continuous voter registration programme that allows people to register for a voter’s card throughout the year; to reduce registration among certain groups, false information might spread that the electoral body has closed the registration portal. These are some of the voter suppression tactics used by disinformation agents to limit participation in the electoral process.

  • Us versus Them content

This form of content preys on existing divisions in society and exploits them to drive extreme political ideologies. Oftentimes, harassment, hate and violence are the vehicles through which this content travels. It is a method highly favoured by populist and illiberal leaders, who exploit differences in society to grab power. This form of content further polarises the polity, drives electorates into echo chambers and leaves no room for constructive political debate across the divide.

  • Delegitimizing content

This form of content targets the entire electoral process, including its structures, equipment and personnel, with the aim of disrupting confidence in the process and diminishing the legitimacy of the election outcome. It can include allegations of mishandling of voter devices, ballot papers and boxes, and other sensitive materials, as well as the delegitimisation of candidates and parties and unsubstantiated claims of foreign interference. However, this should not discount legitimate critique of, or concerns over, any aspect of the electoral process.

  • Unverified declaration of victory

Unverified claims of victory are a common practice aimed at undermining the outcome of elections. They often involve inaccurate or falsified election results, usually circulated before the electoral process has concluded, and are typical of closely contested elections. This often results in post-election violence in fragile democracies.

Media manipulation and mis/disinformation in elections

Advances in technology designed to mislead make it increasingly difficult to verify content. Furthermore, the emergence of new technologies that enable easier manipulation of audio and video content has had a dangerous impact on the integrity of elections.

  • Deepfakes

Deepfakes are new forms of audiovisual manipulation that allow people to create realistic simulations of someone’s face, voice or actions. They enable users of deepfake applications to make it seem like someone said or did something they didn’t. They are getting easier to make, requiring fewer source images to build them, and are increasingly being commercialised. The convincing nature of deepfakes makes it increasingly difficult to reliably detect manipulated media. While there has been a greater prevalence of deepfakes in the Global North, cases of deepfakes being created and shared are starting to surface on the African continent. This is challenging because access to detection tools and the expertise necessary to utilise the tools are not widespread on the continent. 

Deepfakes can be integrated into existing conspiracy theories and disinformation campaigns. This makes it possible for disinformation players in elections to frame the opposition or officials of the electoral umpire in a bad light, in ways capable of undermining the elections.

Deepfakes can also be misused to gaslight the opposition, using humour as an excuse to evade accountability and mask malicious intent. The most common application of deepfake technology is the use of artificial intelligence to digitally insert an individual’s image into sexual videos and photos (“deepnudes”) without their consent. Women, unfortunately, make up 90% or more of deepfake targets. The technology needed to make non-consensual sexual deepfakes is now more easily accessible, in most cases requiring only a simple digital device. The ease of distributing deepnudes online presents a dangerous threat, and they are even more difficult to take down when shared on encrypted messaging platforms like WhatsApp. Given their propensity to disproportionately impact women, deepnudes directly undermine women’s participation in politics.

  • Shallowfakes

As the name implies, this is media manipulation that does not require AI or complex technology. Instead, it relies on simple tricks such as mislabelling content, reposting old content with a new caption, or making slight alterations such as cropping parts of an image, editing a video, or speeding footage up or slowing it down. For example, the speech in this shallowfake video was slowed down in order to portray President Uhuru Kenyatta as an alcoholic.

Shallowfakes require little effort and little time to produce. Though crudely made, they can be devastatingly effective at misleading people and often lead to offline violence. They are the most commonly used form of media manipulation in Africa, and there are many ways they can be presented to mislead electorates during an election.

In the 2021 Ugandan presidential polls, a manipulated image of presidential candidate Robert Ssentamu (Bobi Wine) was circulated, showing him in front of a lavish home in San Francisco. This was done to discredit the opposition and present him as a fraud pretending to identify with the masses while maintaining a lavish lifestyle abroad.

  • Misleading captions

A misleading caption, or one that lies by omission, deliberately leaves out certain pieces of information in order to distort the facts. Examples include videos suggesting that mail-in voting, ballot drop-boxes, vote counting, fixing damaged ballots, cleaning ballots before processing, and other standard election practices are indications of irregularities.

Misleading captions can also be used to create narratives in the minds of electorates. In Kenya, a genuine interview granted by Rigathi Gachagua (the running mate of a main presidential contestant) to a local radio station was posted online with a subtitled English translation which read: “We will kill it [Safaricom] and give that money to the people as handouts.” What he actually said was: “Instead of having one large company called Safaricom paying taxes, if you take the money from the large company and give it out to many people… the tax from the many put together will be 30 times greater than that of the large company.” Mr. Gachagua‘s statement proposed the redistribution of profits to support smaller enterprises and was clearly not a call to “kill” corporate business.

  • Recycled media

This is media content re-labelled and reposted to claim that an event that happened in one place has just happened in another. For example, a video of ballot snatching in one location can be reused and attributed to players in Nigeria, South Africa, Kenya and Zimbabwe. This not only undermines the credibility of the electoral process but can repeatedly incite violence in each location where the content is reposted.

During the 2019 general elections in Nigeria, an eight-year-old CNN report, filmed in the lead up to the 2011 elections and containing footage of Nigerian security forces seizing smuggled weapons, was shared on WhatsApp as if it was a current story. Videos such as these tend to reappear during election season with the aim of creating fear of violence among electorates which could lead to voter suppression. 

  • Mis-dated/mis-placed media

These are media, often videos and images, of events that happened in one place or period but are labelled as having happened in another location or time. This directly hinders the ability to ascertain the origin of information and trace the truth.

For example, in this viral tweet by Nigerian presidential aide Lauretta Onochie, it was alleged that opposition presidential candidate Atiku Abubakar had distributed food and money to voters at one of his campaign rallies during the 2019 Nigerian general elections. However, it was discovered that the picture was not taken at an Atiku campaign rally but had originally been posted in 2017 by Kokun Foundation, a Lagos-based charity organisation.

To effectively confront disinformation in elections, all stakeholders must play their part. The government, social media platforms, the media and the electoral umpire must fulfil their commitment to the people by ensuring a healthy information ecosystem that promotes civic debate and enables voters to engage with information meaningfully. Technology platforms in particular have a responsibility to ensure that their policies are consistently applied in elections outside the Global North. Social media platforms must effectively moderate disinformation in a way that limits its reach and elevates fact-based information.

On the other hand, people must also take agency in confronting disinformation in their own contexts. Ask yourself: Could the information source be a key player in the election? Do they have a vested interest or political ties? Are they a foreign actor with something at stake in the electoral process? Do they stand to benefit financially from spreading disinformation? These questions are the first step in maintaining the critical mindset necessary to resist the persuasive pull of disinformation in elections.


Published 29th July 2022.
