Reader's Digest – 14 August 2020

Our weekly review of articles on terrorist and violent extremist use of the internet, counterterrorism, digital rights, and tech policy.


Top stories:

  • The development of the Terrorist Content Analytics Platform (TCAP) continues! We have published a report, which can be found here, detailing the findings from the public online consultation process. This publication is an important step in our commitment to ensuring that the platform is developed in a transparent manner whilst respecting human rights and fundamental freedoms, including freedom of speech.
     
  • Facebook has published the sixth edition of its Community Standards Enforcement Report, which details how the company enforced its content moderation policies between April and June 2020. Facebook also announced a Request for Proposal (RFP) for independent external auditors to review the metrics used in its reports, a process that will commence in 2021.
     
  • Facebook has also shared details on the three main pillars of its content moderation policy: proactive detection, automation, and prioritisation. The company highlights that it will increasingly use automated systems as a first step across all violation types, detecting violating content before other detection methods come into play.


Islamist Terrorism

  • The Yazidi genocide: we are listening: In this article, Gina Vale reflects on the Yazidi genocide and a new campaign called Nobody's Listening. The Islamic State (IS) is responsible for the kidnapping and killing of 9,900 Yazidis, atrocities that the United Nations Human Rights Council recognised as a genocide in 2016. Yazidi women and girls were used as sex slaves by IS members, and 3,000 of them are still missing. IS also caused the displacement of 400,000 Yazidis in Iraqi Kurdistan. Vale stresses that, given the scarcity of evidence documenting the chain of events, the victims' stories are essential to prosecuting IS for war crimes. The Nobody's Listening campaign addresses this by using virtual reality, art, and photography to advocate for the repatriation and rights of the Yazidis. Through a virtual reality experience, one can follow the path of a female captive, a male survivor of the Kocho massacre (in which IS killed more than 80 men in a single village), or an IS fighter. Vale highlights that the campaign provides significant insight into the plight of the Yazidis and argues that it represents a new form of advocacy, in which technology is essential for building momentum towards holding IS to account for its crimes. (Vale, GNET, 03.08.20).
  • Islamic State coins and the power of imagined statehood: In this blogpost, Ayse Deniz Lokmanoglu analyses IS' minting of its own coins at the peak of its so-called caliphate, coins which brought about a sense of statehood that empowered IS' propaganda. Lokmanoglu explains that, through the use of its own currency, IS moved from the territorial realm to the imagined realm: the coin represents elements of IS' ideology as well as its means of survival, thereby linking the physical and non-physical characteristics of the organisation. This was brought to a global audience through IS' use of the Internet and social media, showcasing that IS was not just an imaginary state existing online, but a real-life sovereign state with a functioning monetary system. Lokmanoglu warns that the power of this propaganda persists today: despite takedowns of IS content by tech companies, IS' centralised media campaign continues to focus on state-building narratives, including the use of its own currency. (Lokmanoglu, The ISIS Reader, 28.07.20).


Far-right violent extremism and terrorism

  • The role of the Internet in facilitating violent extremism: insights from former right-wing extremists: In this article, Tiana Gaudette, Dr. Ryan Scrivens, and Dr. Vivek Venkatesh discuss the role of the Internet in radicalisation, based on a study they conducted through in-depth interviews with 10 Canadian former far-right extremists. They show that one-third of their participants were first exposed to the violent extremist far-right landscape online, whilst two-thirds were initially engaged in offline settings, mostly through an introduction by an "offline" friend. Moreover, the authors find that regardless of where the initial exposure occurred, the Internet facilitates further radicalisation through interactions with content and immersion in violent far-right online networks. In addition, they find that the Internet can serve as a gateway between online and offline engagement, especially through the online promotion of offline events. Finally, they conclude that the former extremists they interviewed used the same identity in online and offline spaces, which contradicts research suggesting that people behave more violently online because they benefit from anonymity. However, the authors stress that this is changing amongst the new generation of violent far-right extremists, who increasingly use encrypted and anonymous platforms. (Gaudette, Scrivens and Venkatesh, GNET, 07.08.20).
     
  • The QAnon conspiracy theory: a security threat in the making?: Amarnath Amarasingam and Marc-André Argentino analyse the QAnon conspiracy theory and its role in radicalising individuals to commit violence. The authors highlight five cases that demonstrate QAnon's potential for inciting violence, and show how the theory is particularly effective at radicalising vulnerable individuals suffering from trauma and mental health problems. In addition, Amarasingam and Argentino stress that QAnon is injecting itself into mainstream discourse, with noteworthy politicians in the United States having been linked to the conspiracy theory. Finally, the authors note that one way of ensuring this ideology does not spread and radicalise others is through content takedown by social media platforms such as Twitter and Facebook. However, they caution that this might lead to QAnon dispersing to other platforms, risking exposure of QAnon followers to other, more extreme channels, something the authors show is already happening on platforms like Telegram. (Amarasingam & Argentino, Combating Terrorism Center at West Point, July 2020).


Counterterrorism

  • Disengagement and reconciliation in conflict-affected settings: In this report, Leanne Erdberg Steadman analyses the potential of peacebuilding tools to transform the individual attitudes, group relationships, and social ecosystems and structures that facilitate disengagement and reconciliation between former violent extremists and their communities. Steadman argues that violent extremism is a social phenomenon, and that bolstering individuals' sense of community is therefore essential for effective disengagement. In addition, she argues that in conflicts wherein "victims, bystanders and adherents" have all experienced trauma, the well-being and capacity for change of these individuals need to be addressed. Steadman therefore stresses that rather than analysing only the risk factors that could lead to violent extremism, counterterrorism policies should give more focus to resilience to violent extremism, in which former extremists can play an important role. Finally, Steadman emphasises that effective disengagement strategies in conflict-prone areas are often overlooked and deserve more attention from counterterrorism practitioners. (Steadman, United States Institute of Peace, 07.08.20).


Tech Policy

  • Facebook bans racist depictions of Jews and black people: Facebook announced that it is updating its hate speech policy to ban racist depictions such as "blackface" and anti-Semitic stereotypes, Saim Saeed reports. The policy update particularly targets blackface ("Zwarte Piet"), a tradition in the Netherlands and Belgium that activist groups have recognised as racist. It also aims to ban anti-Semitic tropes depicting, amongst other things, perceived Jewish "control" over key global institutions. Saeed stresses that Facebook has been under increased pressure to combat racism on its platforms following the murder of George Floyd by police officers in the United States. In its enforcement report (see top stories), Facebook said that it removed 22.5 million pieces of hate speech content from Facebook and 3.3 million from Instagram, more than double the content removed in the first quarter of this year. (Saeed, Politico, 11.08.20).