Our weekly review of articles on terrorist and violent extremist use of the Internet, counterterrorism, digital rights, and tech policy.
– “The coronavirus and Islamic State supporters online”: With everyone talking about the coronavirus crisis, chatter on the topic has also started to appear on messaging platforms used by the Islamic State (IS) and its supporters. Though such posts are not high in number, Chelsea Daymon highlights how they play out within the group’s broader rhetoric and propaganda, relying on recurring themes of “victimhood” and “brutality.” Daymon also notes that the word “corona” has been appearing in the names of pro-IS channels on Telegram. This might be for a number of reasons, including an attempt by the group to capitalise on an important current topic for its own purposes, or to bypass moderation efforts through the use of non-IS related names and avatars. (Daymon, GNET, 13.02.2020)
– On the same topic, “BBC Monitoring further sheds light on IS’ advisory” on the coronavirus outbreak. This article notably assesses the group’s advice against “entering the land of the epidemic,” which has been interpreted by Western media as an IS-issued travel ban to Europe – a misreading of the group’s statement, according to BBC Monitoring. (BBC Monitoring, 16.03.2020)
– “The vigilantes trying to take down Islamic State online”: In this article, Andrei Popoviciu takes a deep dive into both the Islamic State’s online sphere and efforts to disrupt it. Popoviciu focusses on vigilante groups made up of volunteers who track IS operations in their free time, such as Katiba des Kuffars (“Battalion of Infidels”), created in the aftermath of the 2015 Charlie Hebdo attack. The piece also looks at law enforcement-led operations, such as Operation Glowing Symphony, the “biggest [acknowledged] cyberoffensive” by the US government, which sought to collect information and dismantle IS’ online capacities. (Popoviciu, Middle East Eye, 16.03.2020)
– “Al-Qaeda: threat or anachronism?”: “Despite Pompeo’s proclamation [of victory over al-Qaeda], it is hard to deny that, if Osama bin Laden were alive today, he’d likely be a happy man.” Al-Qaeda (AQ) is present in more countries today than at the time of 9/11. Bruce Hoffman and Jacob Ware provide insight into the group’s trajectory and survival since 2011, the year of bin Laden’s death and of the beginning of the Arab Spring. Hoffman and Ware assess AQ and its affiliates’ capacities, as well as the possibility of a joint axis with the Islamic State. The authors conclude the article by cautioning against the belief that the fight against Islamist terrorism is over. (Hoffman & Ware, War on the Rocks, 12.03.2020)
– “U.S. blacklists new Islamic State leader”: The US State Department has blacklisted Amir Muhammad Sa’id Abdal-Rahman al-Mawla, the new leader of the Islamic State (IS), as a “specially designated global terrorist.” According to the State Department, al-Mawla has overseen IS’ global operations and has taken part in organising the persecution, trafficking, and slaughter of the Yazidi minority. The Wall Street Journal also reports that the State Department has blacklisted the Syrian Minister of Defence. (The Wall Street Journal, 17.03.2020)
– “Why law enforcement loves the idea of automated content moderation”: Since the Christchurch attack was livestreamed one year ago, evading Facebook’s moderation systems, tech companies and governments have been partnering to prevent such violence from being livestreamed and going viral. However, Slate warns that these reinforced relationships risk being co-opted to facilitate digital surveillance. The article assesses how law enforcement can benefit from information found on social media platforms, but also how it could attempt to take advantage of content moderation efforts – especially when such moderation involves the use of new technologies. (Slate, 13.03.2020)
– “Coronavirus disrupts social media’s first line of defense”: Wired analyses the potential impacts of the coronavirus and work-from-home policies on tech companies’ content moderation processes. Since much human moderation cannot be conducted by moderators working from home, for security and mental health reasons, companies will instead have to turn more to automated moderation, which mostly focuses on keywords and image scanning. Wired stresses that increased reliance on such systems also bears the risk of impacting free speech, as automated systems are more likely to increase content removal rates, including removal of legitimate content. (Wired, 18.03.2020)
For any questions, please get in touch via:
Tech Against Terrorism is an initiative launched by the United Nations Counter Terrorism Executive Directorate (UN CTED) in April 2017. We support the global technology sector in responding to terrorist use of the internet whilst respecting human rights, and we work to promote public-private partnerships to mitigate this threat. Our research shows that terrorist groups – both jihadist and far-right terrorists – consistently exploit smaller tech platforms when disseminating propaganda. At Tech Against Terrorism, our mission is to support smaller tech companies in tackling this threat whilst respecting human rights and to provide companies with practical tools to facilitate this process. As a public-private partnership, the initiative has been supported by the Global Internet Forum to Counter Terrorism (GIFCT) and the governments of Spain, Switzerland, the Republic of Korea, and Canada.