Update: Terrorist Content Analytics Platform – increased scope and new website 

We are continuing the development of the Terrorist Content Analytics Platform (TCAP) with the support of Public Safety Canada, and we are delighted to share two important updates:       

  • The initial version of the TCAP will host content from designated far-right organisations, expanding the platform’s original scope beyond IS and al-Qaeda content. Read our full statement here.
  • The TCAP now has its own website! You can check it out here. The TCAP website contains everything you need to know about the platform and will, going forward, serve as the central hub for project updates.

Are you interested in getting involved and supporting the world’s first centralised database of terrorist content? Join the TCAP Beta Programme and help us build a useful and user-friendly platform. As a Beta user, you would take part in the Beta launch of the TCAP and provide valuable feedback on the platform through user interviews and by testing new features. You can also contribute to the TCAP if you know of a terrorist content archive that you think should be included on the platform.

If you wish to contribute, please send us an email at [email protected]

In other news

What’s up next? 

  • Our monthly e-learning webinar series continues with a webinar on “Accountability mechanisms for smaller tech platforms”. You can find out more and register here.
  • Season 2 of the Tech Against Terrorism Podcast is still running! Whilst waiting for our next episode, on taking a gender perspective and exploring women’s role in terrorist and violent extremist online spaces, you can find the previous episodes here.  

    Don’t forget to follow us on Twitter to be the first to know when a webinar is announced, or a podcast is released!  

Tech Against Terrorism Reader’s Digest – 3 July

Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy. 

Tech Policy

  • The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression: In this article Kate Klonick documents the history of the creation of the Facebook Oversight Board. Klonick, a professor who specialises in law and technology, has spent time embedded with the Facebook team in charge of designing the Oversight Board. She argues that the Oversight Board represents a “novel articulation of internet governance,” and takes an optimistic view of the Board, concluding that it “has great potential to set new precedent for user participation in private platforms’ governance and a user right to procedure in content moderation.” (Klonick, Yale Law Journal, June 2020)
  • Discord Was Once The Alt-Right’s Favorite Chat App. Now It’s Gone Mainstream And Scored A New $3.5 Billion Valuation: Forbes’ Adam Brown traces Discord’s evolution from its founding in 2015 and its emergence as a popular hub of the alt-right in 2017, to its recent $3.5 billion valuation after successfully removing several white supremacist groups from its platform. Discord is now used to organise Black Lives Matter protests, book clubs and educational sessions. In 2017, white nationalists flooded Discord and used it as an organising tool for the infamous Charlottesville Neo-Nazi rally. Discord has since increased its moderation efforts to tackle use of its platform by far-right violent extremists. Its Trust & Safety team – which comprises 15% of Discord’s overall workforce – specifically focuses on researching white nationalist groups on its platform. In sum, the article shows how Discord improved its counterextremism efforts, capturing interest from investors and broadening its appeal in the process. (Brown, Forbes, 30.06.2020)

On this topic, we also listened to 

 Trust & Safety has a posse, with Adelin Clain and Clara Tsao, co-founders of the Trust & Safety Foundation (TechDirt Podcast, 30.06.2020)

Far-right violent extremism and terrorism

  • Banning Nazis or “burning books”? How big tech is responding to hate speech, and the implications: In this GNET Insights piece, Florence Keen looks at the recent deplatforming of far-right extremist and violent extremist users that took place across multiple online platforms this past week – including on YouTube, Discord, Reddit, and Facebook. Based on these events, Keen looks at how larger tech companies have historically responded to hate speech and what implications such responses might have. Deplatforming remains a contested solution: some argue it simply risks pushing violent extremists onto more fringe platforms, while others argue that such moves are still preferable because they limit audience reach. Here, Keen agrees with the latter, concluding that this debate is not only about freedom of speech but also about public safety and preventing genuine harm. (Keen, GNET, 02.07.2020)
  • YouTube bans Stefan Molyneux, David Duke, Richard Spencer, and more for hate speech: YouTube has banned several prominent white supremacist channels for hate speech, including those belonging to Stefan Molyneux, David Duke, and Richard Spencer. YouTube said that these accounts violated the platform’s policies by “alleging that members of protected groups were inferior.” YouTube’s crackdown on white supremacist channels began in July 2019, when the video-hosting service updated its community guidelines, with violators being sanctioned through different penalties, including removal of their channels. (Alexander, The Verge, 29.06.2020)
  • In France, YouTube has also removed the channel of entertainer Dieudonné for violating its community guidelines. Dieudonné’s videos regularly targeted Jewish people, and anti-Semitic rhetoric was also present in the comment section below his videos. Active since 2015, his channel had seen a drastic increase in views since February 2020, rising by 200% in May 2020.
  • Facebook removes hundreds of boogaloo accounts for ‘promoting violence’ in coordinated takedown: Facebook announced on 30 June that it was labelling the faction of the Boogaloo movement advocating for violence a “dangerous organization”, and that it had taken down 220 Facebook accounts, 28 pages, 106 groups, and 95 Instagram accounts associated with the movement. Facebook also took down a further 400 accounts and an additional 100 pages that praised the Boogaloo movement. Facebook noted that these actions do not mean all Boogaloo content is prohibited on its site, but any posts or symbols that are tied to or support the violent Boogaloo network will be banned. (Lerman, Washington Post, 30.06.2020)

Islamist terrorism

  • Pandemic Narratives: Pro-Islamic State Media and the Coronavirus: Chelsea Daymon and Meili Criezis examine unofficial pro-Islamic State media responses to the coronavirus pandemic. Collecting data from Telegram, Twitter, and Rocket.Chat, they group the responses into 11 major themes, including: number of cases; conspiracy theories; defeating boredom; divine punishment; humour; practical responses; religious support and resources; IS coronavirus news; and socioeconomic decay. Overall, these themes indicate that pro-Islamic State networks have attempted to reinforce camaraderie and community engagement in a time of social isolation. (Daymon and Criezis, Combating Terrorism Center at West Point, June 2020)

For any questions or media requests, please get in touch via:
[email protected]

Background to Tech Against Terrorism

Tech Against Terrorism is an initiative supporting the global technology sector in responding to terrorist use of the internet whilst respecting human rights, and we work to promote public-private partnerships to mitigate this threat. Our research shows that terrorist groups – both jihadist and far-right – consistently exploit smaller tech platforms when disseminating propaganda. Our mission is to support smaller tech companies in tackling this threat and to provide them with practical tools to facilitate this process. As a public-private partnership, the initiative works with the United Nations Counter Terrorism Executive Directorate (UN CTED) and has been supported by the Global Internet Forum to Counter Terrorism (GIFCT) and the governments of Spain, Switzerland, the Republic of Korea, and Canada.