WITNESS and 14 other organizations have joined together to send a public letter to the newly appointed Global Internet Forum to Counter Terrorism (GIFCT) Executive Director Nicholas Rasmussen to apprise him of threats to human rights posed by GIFCT. Many of us have been warning companies and lawmakers about these issues for years. WITNESS is prepared to defend those rights as GIFCT reshapes itself in a milieu rife with political pressure and technological solutionism, and one that lacks shared definitions of terrorism or violent extremism. We are publishing the full text of our letter today, which includes an earlier letter sent to GIFCT founding companies in February.
The February letter outlined many (though not all) of civil society’s key concerns around GIFCT. It points to a serious risk of unlawful censorship from government involvement in GIFCT; a lack of genuine and balanced engagement with civil society; a lack of clarity over the terms “terrorism,” “violent extremism,” and “extremism,” and over what counts as support for or incitement to them; the growing scope and use of a shared hash database without either transparency or remedy for improper removals; and a persistent lack of transparency around GIFCT activity. The letter to Mr. Rasmussen also emphasizes the need to avoid repeating the human rights abuses that have been committed in the name of countering terrorism, as well as the necessity of encryption for human rights defenders.
These letters come at a time when GIFCT is changing and growing. GIFCT has existed for several years, but it’s now more important than ever. In December 2016 Facebook announced that it would join Microsoft, Twitter, and YouTube to create a “shared industry database of ‘hashes’—unique digital ‘fingerprints’—for violent terrorist imagery or terrorist recruitment videos or images that we have removed from our services.” In July of 2017 these companies officially formed the Global Internet Forum to Counter Terrorism with the goal of “disrupting terrorist abuse of members’ digital platforms.” They never performed a public human rights impact assessment and had little-to-no communication with human rights NGOs about the creation and implementation of these new tools.
In the years since, companies have stepped up their use of automated content removal despite being alerted to ongoing mass takedowns of human rights content, including documentation of war crimes. Efforts such as the European regulation on “Dissemination of terrorist content online” have also encouraged increased use of automated removal at the cost of human rights.
The March 2019 Christchurch attacks, in which a right-wing extremist gunman killed 51 Muslims during prayer and livestreamed much of the attack, marked a turning point. The attacks painfully highlighted the online component of offline violence, but they also highlighted the continued failure of law enforcement to protect communities that have been the target of right-wing violence: Muslim groups had warned police of threats weeks before Christchurch. Lawmakers focused on the livestreaming aspect, and introduced the “Christchurch Call to Eliminate Terrorist and Violent Extremist Content Online” in May 2019.
As its name suggests, the Christchurch Call focused on removing content rather than improving other methods of fighting violent hate-motivated attacks. Because the Call centered on content moderation, GIFCT became a central part of these efforts, leading to the reorganization and building up of GIFCT. One of the sad ironies here is that until recently, the GIFCT database consisted exclusively of content from “Islamist terrorist” groups. In fact, platforms are now increasingly removing right-wing extremist content – though not within the GIFCT framework.
In September of 2019 GIFCT announced that it would become an NGO. Mr. Rasmussen is that NGO’s inaugural executive director. Along with the creation of an NGO, GIFCT has created an Independent Advisory Committee and working groups to address specific issues such as transparency. Unfortunately, as we note in our letter, “We chose not to apply to participate in the IAC, because the concerns we have raised have not yet been sufficiently addressed.”
Our comments come at a key time for GIFCT, as its activities become more visible. Last week, it held the first multi-stakeholder forum since its reorganization. GIFCT’s new working groups have also had their initial meetings in recent weeks, and it is clear that GIFCT will play a big role going forward. GIFCT also published its second transparency report last week, which replaced the previous transparency report on its website.
So what happens next? The new transparency report does include more information than the last one, and it is refreshing to see the recording of the multi-stakeholder forum made public, but that’s not enough. As we note in our letter to Mr. Rasmussen:
In a troubling trend, policy makers in government and at technology companies are increasingly treating content moderation as the tool of choice for counter-terrorism work. GIFCT is also engaging with law enforcement and experts in challenging violent extremism and counter-terrorism without transparency or any real assessment of the potential human rights harms this could cause. Counter-terrorism programs and surveillance have violated the rights of Muslims, Arabs, and other groups around the world, and have been used by governments to silence civil society.
The coming months will determine whether GIFCT will become yet another multi-stakeholder forum where human rights experts are brought in as window-dressing while government and companies work closely together, or whether it will truly be able to provide an avenue for human rights advocates to ensure that basic human rights are respected in the mad rush to “eliminate” poorly defined “terrorist and violent extremist content.” WITNESS will be watching, and we’re not alone.
Read our full letter here (link to PDF).
30 July 2020
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.