—December 23, 2019

In late November 2019, WITNESS led a workshop on deepfakes at the University of Pretoria, South Africa. Building on the success of a workshop in Brazil in July, and with another workshop in Malaysia scheduled for 2020, the event series has been a unique effort to engage academics, journalists, researchers and human rights activists from the Global South in the process of responding to deepfakes.

In many ways it shouldn’t be so unique: There’s nothing about disinformation or synthetic media that makes it solely an American problem, for example. Where social media platforms like Twitter or Facebook are concerned, the bulk of users are now outside of the US; we know that WhatsApp plays a huge role in Indian politics and YouTube has had a significant influence in Brazil, to pick two country-specific examples. Yet the social platforms, almost all of which are based in Silicon Valley, tend to give precedence to the needs of American users, American legislators and American investors, with European bodies also having an outsize voice.

That’s why we’re trying to de-center Euro-American perspectives in our research on synthetic media, hosting events around the world to source a diversity of voices that will inform WITNESS’ advocacy work with tech platforms.

Activists review threat scenarios at the workshop

Highlighting media literacy need

Our Pretoria workshop was divided into two sections: The first half of the day was explanatory, with presentations from WITNESS staff, the fact-checking organization Media Monitoring Africa and deepfake detection expert Francesco Marra of the University of Naples Federico II. Having built a consensus around the nature of deepfakes, the second half of the day involved threat modeling and prioritization exercises — the results of which are outlined in a previous blog post — and a ranking of potential solutions that the participants believed would help.

We split the room into three groups for a discussion of solutions. One group would focus on the response from social platforms, another would cover detection and authentication techniques, and the last would look at media literacy initiatives. When the time came for participants to join a group, we were surprised by the outcome: Rather than an even split, more than half of all the attendees chose to discuss media literacy, with each of the other groups attracting only a handful of people.

A common line of argument emerged: With a new level of understanding of synthetic media, activists recognized that it would be hard to convey sophisticated media manipulations to the groups they worked with while those groups were susceptible to more simplistic disinformation campaigns. As a result, improving media literacy at a more fundamental level would be a precursor to addressing the challenges presented by deepfakes.

Steps to building practical literacy

Approaches to teaching media literacy generally focus on two questions: how different types of media create certain effects, and why. The second part, the why, encompasses the function of media in society: building support or opposition for certain policies, commenting on cultural phenomena, codifying social norms, casting different groups in a positive or negative light, and so on.

Analyzing the why of media also involves understanding the financial and political interests that shape the media (including social media), and who stands to gain from certain types of story achieving prominence. When a news consumer understands the motivation for promoting a certain narrative, they are less likely to passively accept it. And importantly, as work on inoculation against misinformation has shown, being exposed to false narratives in advance makes people less likely to be taken in by them.

During our workshop discussion, there was no single suggestion that promised to fix the problem of low media literacy. Rather, there was agreement on some of the key elements of a successful media literacy program, which still managed to capture important cultural specifics that could otherwise be overlooked.

1. Localize content

South Africa is a nation with 11 official languages and dozens more that are widely spoken; many other countries in the region have a similar linguistic diversity. For maximum impact, materials surrounding deepfakes and misinformation should be widely translated into local languages, making the content feel relevant for each community.

As always, translation takes time and resources — so this is one clear output that media literacy funding could target, according to what we heard.

2. Use trusted channels

At one point in the discussion we asked, “What media channels are trusted in your community?”

Many of the respondents suggested that radio and local newspapers had high levels of trust, more so than national news outlets. As a result, media literacy projects could start by placing messages in these local channels and linking them with national media narratives — ensuring that the campaigns start from a grassroots base and build upwards.

3. Work within social structures

Any given community has established social structures and figures whose opinions carry weight. These can be traditional elders, local politicians, religious leaders, celebrities, online influencers, and more.

To successfully promote messages around media literacy, participants thought it was key to identify and work with these figures of influence, which would lead to better outcomes than trying to gain traction without engaging them.

4. Incorporate into education

When it comes to media, it’s easier to teach critical thinking skills from a young age than to try to change thought patterns later in life. Our workshop participants suggested that media literacy could be better integrated into the school curriculum and given higher priority, equipping younger generations with the skills they will need to navigate the digital news landscape.

Conclusion: Building habits

New skills really start to influence our behavior at the point they become habits.

“Here in South Africa, lots of people install call screening apps to get rid of nuisance calls,” said a local activist during our discussion. “They don’t have to be technical; it helps them, so they get into the habit of using it. Couldn’t it be the same thing for fake video? We need to work out how to get people to be more critical as a habit.”

There’s a lot of value in considering media literacy as a habit, and strategizing around ways we might nudge people toward behaviors that would slow the spread of misinformation. It might be something akin to call screening, a technological solution where an app gives additional context to the media presented on our phones. But it could also be something simpler — some kind of mental checklist to run through before hitting the “share” button.
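To make the “mental checklist” idea concrete, here is a minimal sketch of what a pre-share nudge could look like in software. The checklist questions and the function name are purely illustrative assumptions, not a design WITNESS or the workshop participants proposed:

```python
# A hypothetical pre-share prompt: before a post goes out, the user is
# asked a few critical questions, and sharing is only unblocked once
# every question has been answered "yes". All wording here is illustrative.

CHECKLIST = [
    "Do you know the original source of this media?",
    "Has the claim been reported by an outlet you trust?",
    "Could this content be edited or taken out of context?",
]

def review_before_sharing(answers):
    """Return True only if every checklist question was answered 'yes'.

    `answers` maps each question to a boolean; any missing or negative
    answer blocks the share, nudging the user to pause and verify first.
    """
    return all(answers.get(question, False) for question in CHECKLIST)

# Example: the user skipped the first (source-checking) question,
# so the share is held back.
answers = {CHECKLIST[1]: True, CHECKLIST[2]: True}
print(review_before_sharing(answers))  # False: pause and verify first
```

The point of a sketch like this is not the code itself but the interaction pattern: a small, repeated friction at the moment of sharing is exactly the kind of thing that can turn critical thinking into a habit, the way call screening apps did for nuisance calls.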

Either way, the activists WITNESS encountered in South Africa gave many useful insights that will inform our work in the future. We’ll keep bringing you these insights here on the blog, and we hope that you’ll support our work by sharing it widely.
