
Reader's Digest – 11 September 2020

Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.


Top Stories

  • C-REX, the Center for Research on Extremism at the University of Oslo, has released a new compendium called Knowing what’s (far) right that provides an overview of the far-right extremist scene, particularly in western countries. 

  • Politico has reviewed two draft versions of this year's US Department of Homeland Security report on the domestic US terrorism threat. Both drafts highlight that white supremacists represent the most significant threat in the US. 

  • Michelle Bachelet, the UN High Commissioner for Human Rights, has sent a letter on the EU Digital Services Act (DSA) to Ursula von der Leyen, President of the European Commission, calling for the EU to root the DSA in international human rights law and to ensure the safeguarding of freedom of expression and privacy. 

  • TikTok has announced that it has joined the EU’s Code of Conduct on Countering Illegal Hate Speech. In a statement, TikTok’s head of trust and safety for EMEA stated: “We have never allowed hate on TikTok, and we believe it’s important that internet platforms are held to account on an issue as crucial as this."


Tech policy

  • Regulating social media: the fight over Section 230 – and beyond. In this report, Paul M. Barrett analyses the ongoing US debate around modifying Section 230 of the Communications Decency Act, which protects tech companies from liability for content posted by users on their platforms. With Section 230 currently being “a focal point in a heated debate about U.S government’s role in regulating how social media sites handle content”, Barrett reviews the history of this landmark law before examining critiques of it and recent legislative proposals introduced to amend it. Barrett also looks at reform proposals made by civil society actors. Based on this review, Barrett argues for Section 230 to be preserved but improved. For instance, Section 230 could be used as leverage to push platforms to accept “a range of new responsibilities related to policing content”, with platforms that refuse these responsibilities losing their protection from liability. Barrett also advocates for the creation of a Digital Regulatory Agency to respond to the ongoing “crisis of trust” in tech companies’ ability to monitor their platforms. According to Barrett, this agency could oversee the enforcement of a revised Section 230. (Barrett, NYU Stern Center for Business and Human Rights, September 2020)
  • ‘Video unavailable’: social media platforms remove evidence of war crimes. In a new report, Human Rights Watch (HRW) warns of the negative implications of tech companies' removal of terrorist and other violent content, which has resulted in the loss of evidence that could be used to prosecute perpetrators of war crimes and terrorism. Whilst national law enforcement agencies can request that companies preserve and share removed content, international bodies such as the International Criminal Court have limited ability to make such requests since they lack law enforcement standing. To allow social media content to serve as evidence, HRW proposes a new independent mechanism to support the archiving of content removed by tech companies under counterterrorism policies, and notes that Tech Against Terrorism’s Terrorist Content Analytics Platform (TCAP) is well placed to potentially serve this purpose. HRW concludes by urging social media companies and other relevant stakeholders to launch a consultation process on the principles and framework that should guide the independent archiving mechanism. (Human Rights Watch, 10 September 2020)
  • If lawmakers don't like platforms' speech rules, here's what they can do about it. Spoiler: the options aren't great. Daphne Keller explores potential regulatory approaches for platforms’ management of material that is lawful, but widely disapproved and often restricted under platforms’ Terms of Service. Keller introduces the content removal debate and notes that both sides – those advocating for platforms to take down less content and those arguing they should take down more – have recently turned to courts and legislatures, looking to impose either “must-remove” laws (requiring platforms to remove or demote currently lawful speech) or “must-carry” laws (hindering platforms from removing or disfavouring lawful speech). She briefly discusses the history of “must-carry” and “must-remove” rules that existed for older means of communication, suggesting that there is a lot to learn from this history. Keller then lays out what the different “must-carry” or “must-remove” laws may look like as well as their potential implications. In doing so, she examines five possible approaches that would change the rules platforms apply to their users’ legal speech: “common carriage” rules for major platforms, indecency rules, fairness rules, amplification rules, and rules for dominant platforms. Keller also assesses proposals that would shift decision-making away from the platforms, either directly giving users control over what they see or enabling competing third-party tools to provide alternative experiences through which users select rule sets or filters for managing platform content. Keller recognises that each regulatory approach has its downsides and stresses that they need more thorough investigation by legal experts. She concludes by calling for further debate on this topic, especially at the intersection of competition and speech law. (Keller, Techdirt, 09.09.2020)


Islamist terrorism

  • We are already behind enemy lines: lessons from the secretive online fight against the Islamic State. Len Anderson, Nathan Fleischaker, and Brian Russel take a deep dive into the US Cyber Command’s fight against the Islamic State (IS). More precisely, they focus on Operation Glowing Symphony, an operation launched in 2016 by a task force designed to tackle IS’s information systems, both within and outside combat zones, using a “combined application of physical and informational power”. Through this case study, the authors demonstrate the capacity of task force operations (in this case led by a Marine unit) to engage both in more traditional combat zones and in “gray zones” (activities outside open-conflict areas) in order to achieve political and strategic objectives. In particular, their review of Glowing Symphony stresses the importance of conducting security operations in the information environment in “gray zones”, as Glowing Symphony allowed the US military to gain an operational advantage over IS through an overall understanding of the group’s networks and capacities on a global scale. The authors conclude by arguing for such a strategy of global engagement, online and offline, to be replicated in other operations when engaging with US adversaries. (Anderson, Fleischaker, and Russel, War on the Rocks, 07.09.2020)

  • On Islamist terrorism, we have also been watching:
    Central Asia: The Call of Daesh – Asie Centrale, l’Appel de Daesh. Available in French and German, Arte.


Counterterrorism 

  • Indian states, deradicalisation and online platforms. In recent years, India has incorporated “systemic deradicalisation programs” into its counterterrorism approach, leading many Indian states to develop their own programmes. Drawing on examples of deradicalisation programmes implemented in the states of Maharashtra and Kerala, Kabir Taneja underlines the existence of “an ever-dynamic gap between technological advancement and public and security policy”. This gap is evident in the contrast between the online space’s potential as an avenue for radicalisation and the continuing struggle of law enforcement and policing authorities to comprehend and incorporate the online threat into their approach to deradicalisation. Taneja argues that a more holistic approach to deradicalisation and counternarratives, bridging the online and offline ecosystems, would better equip law enforcement and policing authorities with “knowledge and data of how online radical and terror ecosystems behave” to counter the terrorist threat. (Taneja, GNET, 08.09.2020)

  • Terror and Technology from Dynamite to Drones. In this book review, T. X. Hammes discusses Audrey Kurth Cronin’s book Power to the People: How Open Technological Innovation is Arming Tomorrow’s Terrorists. Hammes begins by contextualising insurgents’ and terrorists’ use of openly available commercial technology, which, according to him, has consistently caught advanced militaries off-guard. He then explores Cronin’s historical case studies of dynamite and AK-47s, which illustrate how the commercial utility and diffusion of technology have proved critical to their adoption by non-state actors. Addressing the present, particularly how terrorists and insurgents have exploited social and global communications networks as well as emerging lethal innovations, Hammes links newer commercial technologies to recent terrorist activities. He also refers to Cronin’s list of characteristics non-state actors seek in innovative weapons, such as being accessible, cheap, concealable, and effective in a wide range of contexts. According to Hammes, the full list provides a helpful guide for considering how terrorists may use new technology and what states could do to avert it. Hammes stresses the importance of Cronin’s book for students and observers of national security and emphasises that, in a time of accelerated technological change, understanding the nexus of terror and technology may prevent increasingly deadly attacks. (Hammes, War on the Rocks, 04.09.2020)

  • To Prevent Extremist Violence in the United States, Think Beyond the Homeland Security Box. Eric Rosand and Stevan Weine assess the United States’ efforts to prevent violent extremism and draw on international examples to provide recommendations for improvement. Rosand and Weine explain how, over the past decade, the US has depended on the Department of Homeland Security to work with local law enforcement, municipalities, and communities to improve their capability to prevent violent extremism. While under the Obama Administration community-based projects focused particularly on Muslim communities and lacked cooperation from community groups who felt discriminated against by such targeted methods, the Trump Administration introduced a framework that included both white supremacist and non-ideologically driven violence. Rosand and Weine continue by arguing that the most promising strategies can be found in partnerships led by actors not involved in security. They therefore stress the need to transition from Homeland Security to new prevention models, in particular by involving the Department of Health and Human Services. In doing so, they discuss countering violent extremism practices in other countries, such as Denmark and Germany, that have succeeded in developing local partnerships, and recommend that the US draw on such models. (Rosand and Weine, Lawfare, 06.09.2020)


For any questions, please get in touch via:

contact@techagainstterrorism.org