Reader’s Digest – 11 June 2021

Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.

Webinar Alert

  • Our next TAT & GIFCT E-learning Webinar Series, “APAC in Focus: Regional Responses to Terrorist and Violent Extremist Activity Online”, will take place on Thursday, 24 June, 3 pm BST. Stay tuned for the agenda announcement!

Tech Against Terrorism Updates

  • We are excited to announce that an updated version of the Knowledge Sharing Platform (KSP) will be re-launched to tech platforms soon. The KSP is a collection of interactive tools and resources designed to support the operational needs of smaller tech platforms — a “one stop shop” where companies can access practical resources to support their counterterrorism and transparency efforts. It is a free platform containing research and guidelines on topics including policies and content standards, terrorist and violent extremist use of the internet, proscribed groups, online regulation, and transparency reporting. Stay tuned for further announcements about the launch date!

Terrorist Content Analytics Platform (TCAP)

  • Last week (31.05–06.06), the TCAP identified and verified 269 URLs containing terrorist content and sent 171 alerts to 26 tech companies; 78% of this content is now offline.

Top Stories

  • Nigeria’s government banned Twitter on 4 June after Twitter moderated a post from Nigeria’s President Muhammadu Buhari in which he threatened to punish secessionists. Users face prosecution if they continue to tweet.

    Local rights groups have filed a lawsuit with a regional court to lift the ban, arguing that the Twitter suspension is an attempt to censor criticism of the government.
  • President Biden has revoked some of the sanctions against TikTok, WeChat, and other Chinese tech companies that were imposed by former President Trump.
  • Ireland’s Human Rights and Equality Commissioner has called for the Irish government to do more to tackle hate speech and racism online.
  • Italian counterterrorism police have stated that they dismantled a far-right, neo-Nazi extremist group called the Roman Aryan Order, which was planning to bomb a NATO base.
  • The Electronic Frontier Foundation has welcomed the change in Facebook’s content moderation policy to hold “influential users”, including politicians, to the same rules as other users. They caution, however, that Facebook has more progress to make and should uphold the Santa Clara Principles, which call for greater accountability and transparency from platforms.
  • Four members of a Muslim family were killed in Canada this week in what Canadian authorities are treating as a far-right terrorist attack.

Tech Policy

  • WhatsApp blocks accounts of at least seven Gaza Strip journalists: In this article, Reporters Without Borders (RSF) discusses WhatsApp’s banning of Palestinian journalists and calls on the tech company to be more transparent about its moderation and appeal processes. The organisation highlights how WhatsApp banned the accounts of seven journalists in the Gaza Strip following the recent ceasefire. RSF argues that the algorithms WhatsApp uses to identify illegal behaviour “are flawed” and effectively censor journalists, who need to access violent content, sometimes originating from violent groups, in order to report on the conflict. When RSF contacted WhatsApp, the company replied that, in following its policies and relevant legislation, a number of journalists were “inadvertently impacted” and that it “worked quickly to reinstate their accounts”. RSF concludes that transparency is required on WhatsApp’s content removal procedures and algorithms, as well as on its appeal process. (Reporters Without Borders, 07.06.21).
  • Amplification and Its Discontents: Platform regulation expert Daphne Keller examines different models for regulating content amplification, which has increasingly been discussed as a regulatory option in the United States and Europe. Keller sees three possible models: a first in which platforms are legally obliged to prevent amplification of illegal content, a second in which they are required to prevent amplification of “harmful but lawful” speech, and a third in which platforms are held liable for amplifying any content at all. Keller identifies several problems with each approach, for example around users’ rights, platforms’ rights, and privacy, and warns that, like removal obligations, anti-amplification provisions could lead companies to err on the side of caution and demote legitimate content, with negative consequences for freedom of speech. Finally, Keller stresses that in the US each of these approaches will inevitably run into obstacles under the First Amendment. (Keller, Knight First Amendment Institute at Columbia University, 08.06.2021)

    Earlier this year, Tech Against Terrorism published a position paper on content personalisation and terrorist use of the internet. Read it here.

    Last year, Daphne Keller was a guest on the Tech Against Terrorism podcast. You can listen to the episode here.


  • Gendering the boy child in the context of counterterrorism: the situation of boys in Northeast Syria: Fionnuala Ní Aoláin, the UN Special Rapporteur on the Promotion and Protection of Human Rights and Fundamental Freedoms while Countering Terrorism, uses a gender approach to analyse the situation of young boys and male adolescents held in detention camps in northeast Syria. She argues that these boys and adolescents are vulnerable to unique harms because of their gender and the assumptions attached to it in a counterterrorism context, their presumed association with the Islamic State (IS), their geographic location, and their religious beliefs. She elaborates that, because they are boys, the rhetoric of protection that predominates in discussions of female children and adolescents is absent, leading to severe human rights abuses. She highlights that, first and foremost, these boys should be seen as victims of terrorism, with consequent implications for their treatment under the Convention on the Rights of the Child. She concludes that we need to avoid gendered narratives and tropes that undervalue the harms to which these boys are exposed. (Ní Aoláin, Just Security, 08.06.21).

For any questions, please get in touch via:
[email protected]