Headline news

  • The Tech Against Terrorism Podcast made its return last month, and our episode on “Gender approaches on women’s role in online extremist sphere” is now live on most podcast streaming platforms! This podcast, featuring Dr. Joana Cook and Dr. Elisabeth Pearson, discusses the benefits of using gender approaches in understanding women’s role in online extremist spheres and reflects on how this affects the tech sector’s role in tackling terrorism online.

  • We are also back with the Tech Against Terrorism and GIFCT E-learning webinars! In July, we hosted a webinar on accountability mechanisms for tech platforms. Speakers included Jillian C. York of the Electronic Frontier Foundation, Dina Hussein of Facebook, Sean Li of Discord, and Zoe Darme of Microsoft, who provided insights into the importance of accountability for tech platforms and the challenges such mechanisms entail. You can read a summary of the webinar here.

  • If you wish to access the recording of this webinar, or of previous ones, please send us an email at [email protected]
     
  • The 2020 GIFCT Multi-Stakeholder Summit took place on 23 July under the leadership of newly appointed executive director Nicholas Rasmussen. This was the fourth GIFCT summit, the first having been organised by Tech Against Terrorism in 2017, and the first held by the GIFCT as an independent organisation. The summit saw presentations from a range of stakeholders – including Tech Against Terrorism – recordings of which you can find here. During the event, the GIFCT working groups were also introduced. Tech Against Terrorism is delighted to chair the working group on technical approaches, which you can read more about here.

Tech Against Terrorism in the media:

  • Tech Against Terrorism’s Director Adam Hadley, in an op-ed in the Independent, cautioned that asking big platforms to take down extremist content more quickly is not a solution to either terrorism or terrorist use of the internet.

  • “The most effective way to prevent terrorists’ exploitation of smaller social media companies, while ensuring online competition thrives, is to provide them with the practical tools to make their platforms more secure. Tech Against Terrorism mentors smaller platforms to help them deal with the threat and spot and remove content quickly and efficiently.”

    Furthermore, governments need to work with the global tech sector, including around designating terrorist and violent extremist actors, to provide more definitional clarity around terrorism and to anchor online content moderation efforts in the rule of law.
     
  • Earlier in July, Tech Against Terrorism was quoted in the Telegraph on the rise of “alt-tech” platforms as a result of mainstream social media platforms removing extremist and terrorist content.

  • “Director at Tech Against Terrorism – a UN supported initiative that works with global tech firms – Adam Hadley said the ‘vast majority’ of activity by terrorists and extremists took place on smaller messaging apps and content storage platforms.                    
    ‘These smaller platforms may not have the capacity to moderate hate crime and incitement to violence as effectively as they need to,’ he said.                    

    ‘But most take their responsibilities very seriously, and many seek the support of Tech Against Terrorism to prevent exploitation by bad actors.’                

    Mr Hadley said a ‘small proportion’ of apps had been created for ‘narrow political purposes’. ‘In my view this potentially creates fertile ground for terrorists and violent extremists to spread harmful content that could lead to radicalisation and violence,’ he said.”

What’s up next?

  • Season 2 of the Tech Against Terrorism Podcast will continue this month, with an episode on the accelerationist movement. You can find all previous episodes here.
  • Don’t forget to follow us on Twitter to be the first to know when a webinar is announced, or a podcast is released!


Tech Against Terrorism Reader’s Digest – 7 August

Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.


Top stories:

  • On 6 August, US President Trump issued two executive orders banning dealings with the owners of TikTok and the messaging app WeChat. The bans, which come into effect in 45 days, could see the apps removed from US distribution.

  • Facebook has removed one of the largest QAnon groups present on the platform. The group, with nearly 200,000 members, acted as the official account of QAnon on Facebook.

  • AccessNow has said that there have been 53 Internet shutdowns by governments across 15 countries in 2020. Find out more here.

  • Irene Khan is the new Special Rapporteur on Freedom of Expression, a role she is taking over from David Kaye. Earlier this week several civil society groups welcomed her appointment.

  • Global Voices has announced a new series on content moderation in the Global South – “a series of articles which is meant to bridge the divide between the ongoing policy debates around content moderation, and the people who are most impacted by them.”

  • Ben Makuch reports on the new neo-Nazi group National Socialist Order (NSO), born out of the infamous Atomwaffen Division – which reportedly disbanded in March.

Far-right violent extremism and terrorism:

  • Body bags and enemy lists: how far-right police officers and ex-soldiers planned for ‘Day X’: In this article, Katrin Bennhold tells the tale of Nordkreuz, or Northern Cross – a German far-right group which grew out of a network of current and former German soldiers and police officers – and of its preparation for “Day X”, the day when the German social order would collapse and far-right extremists would “save the nation”. Bennhold looks at the origins of the group, its evolution, its alleged plot to round up and kill migrants on “Day X”, and the ongoing police investigation targeting some of its members. Bennhold notes that Nordkreuz is another example of current and former members of the German police and military joining far-right extremist organisations. Recently, an entire company of the German Special Forces was disbanded after it became clear that it had been infiltrated by extremists. There have also been reported cases of police computers being used to access personal information that was subsequently used by neo-Nazi groups to target individuals with death threats. (Bennhold, The New York Times, 01.08.2020).

  • Radical right terrorists are usually self-radicalized: In this article, Prof. Matthew Feldman dwells on what has emerged as a challenge to the understanding of terrorism as a “group-based” phenomenon: the rise of “lone-wolf terrorism”. Feldman calls for increased research into “lone-wolf” or “self-activating” terrorism, and in doing so provides an overview of how such tactics have been a hallmark of far-right violent extremism and terrorism since the 1980s. Tracing “lone-wolf terrorism” back to the 19th-century anarchist movements’ “propaganda of the deed”, and assessing key ideologues of the violent far-right’s “leaderless resistance” (from neo-Nazi James Mason to the Ku Klux Klan’s Louis Beam), Feldman analyses how this phenomenon is finding new resonance through “online extremism”. He sees Norwegian terrorist Anders Behring Breivik’s manifesto as a case in point, demonstrating the cross-over between “self-activating” terrorism and use of the internet as an amplifying platform for lone actors. (Prof. Feldman, CARR, 03.08.2020).

  • Neo-Nazi terror group Atomwaffen Division re-emerges under new name: In this article, Ben Makuch reports on the newly announced neo-Nazi group “National Socialist Order” (NSO). Born out of neo-Nazi group Atomwaffen Division, which is believed to have disbanded following an FBI crackdown in early 2020, NSO demonstrates the capacity of accelerationist groups to live on and expand online despite offline efforts to counter them. An NSO leader quoted in the piece notes that the group plans to evade law enforcement by relying on encrypted email services and by conducting in-person (rather than online) recruitment. (Makuch, Vice Motherboard, 05.08.2020).

  • Facebook’s disruption of the Boogaloo network: Jonathan Lewis analyses Facebook’s clampdown on the Boogaloo movement and argues that tech company measures, combined with coordinated designation by nation states, are the best way forward in dealing with the online threat posed by the movement. Following an update to its dangerous organisations policy, Facebook took down 220 Facebook accounts, 95 Instagram accounts, 106 groups, and 28 pages linked to the movement. Lewis argues that counterterrorism, including in the online sphere, is inherently the responsibility of the public realm, and he therefore calls for centralised designation of far-right terrorist groups, quoting Tech Against Terrorism Director Adam Hadley: “designating far-right organisations as terrorist groups would help the smaller platforms that are most vulnerable to extreme far-right exploitation by giving them the legal protection they need to remove content unchallenged.” Lewis adds that tech companies should also do their part through coordinated efforts between larger and smaller tech companies, including via the Global Internet Forum to Counter Terrorism (GIFCT). (Lewis, GNET, 05.20.2020).

  • This week, we’re listening to the Lawfare Podcast’s Arbiters of Truth series and their episode on QAnon, the real-world impact the theory is having, and what tech companies are doing about it. Listen here.

Islamist terrorism:

  • ISIS and the Militant Jihad on Instagram: Anne Speckhard sheds light on the presence of Islamic State (IS) and other Islamist terrorist groups on Instagram. The article warns that the Islamist landscape on Instagram has largely been overlooked. In her analysis, Speckhard distinguishes between private and public accounts. Private accounts are more explicitly violent, disseminating propaganda that promotes terrorism while largely omitting expressions of allegiance to IS. However, Speckhard identifies some accounts as outright pro-IS, most commonly accounts run by women from the al-Hol camp in Syria. Public accounts, on the other hand, are used to express views that might not be extremist in nature but are subsequently shared and utilised by IS supporters. Speckhard adds that Instagram provides a good opportunity for deploying counternarratives to limit the influence of terrorist propaganda. According to Speckhard, such counternarratives should focus holistically on the Islamist terrorist online presence rather than on IS as a group, and make use of short, emotionally evocative videos. (Speckhard, Modern Diplomacy, 03.07.20).

  • The digital fight against Islamic State and al-Qaeda: In this article, Rida Lyammouri and Youssef Eddazi discuss tech companies’ measures to counter the online presence of al-Qaeda and the Islamic State (IS), and these groups’ ability to maintain and even expand that presence despite such efforts. With Facebook, Twitter, Microsoft, and YouTube harmonising their strategies to combat al-Qaeda’s and IS’s use of their platforms, supporters have metastasised to other platforms such as Telegram, although a joint Europol and Telegram operation has significantly disrupted their presence there. However, Lyammouri and Eddazi warn that IS and al-Qaeda have branched out to smaller tech companies. The authors conclude that it is essential to identify these new platforms in order to detect the groups’ propaganda and to prevent al-Qaeda and IS from using them for recruitment, radicalisation, and fundraising purposes. (Lyammouri & Eddazi, GNET, 03.08.20).

  • Islamic State’s great ‘capacity to regenerate’ itself: Mourad Kamel discusses the regrouping and potential resurgence of the Islamic State under its new leader Abu Ibrahim al-Hashimi al-Qurashi. Kamel warns that under al-Qurashi’s leadership an IS resurgence is on the horizon in Iraq and Syria, made possible by the group’s stockpiling of weapons, its exertion of control over its main branch and affiliates, and its reclaiming of territory to establish an economy that can finance its operations. Kamel concludes that the instability in Iraq, the ongoing conflict in Syria, and Turkey’s operations across the Syrian border are set to provide fertile ground for such a resurgence. (Kamel, the African Project, 31.07.20).

For any questions or media requests, please get in touch via:
[email protected]


Background to Tech Against Terrorism

Tech Against Terrorism is an initiative supporting the global technology sector in responding to terrorist use of the internet whilst respecting human rights, and we work to promote public-private partnerships to mitigate this threat. Our research shows that terrorist groups – both jihadist and far-right terrorists – consistently exploit smaller tech platforms when disseminating propaganda. At Tech Against Terrorism, our mission is to support smaller tech companies in tackling this threat whilst respecting human rights and to provide companies with practical tools to facilitate this process. As a public-private partnership, the initiative works with the United Nations Counter Terrorism Executive Directorate (UN CTED) and has been supported by the Global Internet Forum to Counter Terrorism (GIFCT) and the governments of Spain, Switzerland, the Republic of Korea, and Canada.