NOVEMBER 2021 UPDATE TO THE EUROPEAN UNION’S REGULATORY FRAMEWORK
Tech Against Terrorism first analysed the EU’s online regulatory framework in the first edition of the Online Regulation Series 2020, and we subsequently updated our analysis for the Online Regulation Series Handbook in July 2021. In this entry, we provide updates on further developments in the last six months and add resources that were not included in the initial version of our analysis.
The regulation on addressing the dissemination of terrorist content online (passed)
The EU’s Regulation 2021/784 on addressing the dissemination of terrorist content online (often called the “TCO” or “TERREG”) entered into force in June 2021 and will become fully applicable in June 2022. In October 2021, the EU Commission commendably announced that, as Tech Against Terrorism recommended in our response to the regulation, it will fund initiatives supporting smaller platforms in complying with the regulation. It was also announced that Europol’s PERCI tool – a platform built to facilitate the implementation of the TCO – will be used by Member States’ ‘competent authorities’ (see below) to issue removal orders.
Seven UN Special Rapporteurs once again voiced criticism of the regulation as part of a wider critique of the EU’s Counter-Terrorism Agenda (see below). They warned that the definition employed in the TCO regulation could conceivably encompass legitimate forms of expression. The Rapporteurs called on the EU to provide further information on the definition of ‘terrorist content’ and to explain its “operational consistency with human rights norms of legal precision”. The criticisms made of the TCO before it received legislative approval can be found below.
Digital Services Act (proposed)
Following a first draft published in December 2020, the Digital Services Act (DSA) has undergone scrutiny both in the European Parliament and the Council of the EU.
In Parliament, the Internal Market and Consumer Protection (IMCO) Committee has primary responsibility for scrutinising the DSA. The Committee on Civil Liberties, Justice and Home Affairs (LIBE), the Committee for Industry, Research and Energy (ITRE) and the Legal Affairs Committee (JURI) are associated committees for the purpose of scrutinising the DSA and adopted their positions on the DSA in September and October 2021.
The DSA received extensive scrutiny in the European Parliament, in the course of which the various committees proposed more than 2,000 amendments to the original draft. Whilst some amendments focus on expanding the scope of the law and on increasing transparency requirements, others propose content removal deadlines which were absent in the first draft of the law and are similar to those found in other regulations, including the TCO. Amendments adopted by the JURI committee, for example, propose a removal deadline of 30 minutes for illegal content disseminated by live broadcast, and 24 hours for illegal content which “can seriously harm public policy [and] public security” (this amendment was broadly mirrored in IMCO’s draft report).
Digital rights groups have raised concerns about these developments in Parliament, arguing that they risk undermining what many such groups saw as a promising start in the first draft of the DSA. The Electronic Frontier Foundation (EFF) said that amendments such as the tight removal deadlines could increase reliance on content filters, which according to the EFF would lead to the “overblocking” of content. Further, the Brussels-based digital rights network EDRi warned that some of the parliamentary amendments risk committing the DSA to what EDRi views as the same flawed instruments introduced by the TCO regulation. EDRi has also warned that this would lead to a lack of independent judicial oversight and “unjustified” removal deadlines, and would incentivise an increased reliance on platforms’ Terms of Service as a basis for content removal, to the exclusion of the rule of law.
The IMCO Committee is currently working on finalising its joint position, which it is expected to do in December 2021.
Meanwhile, the Council announced that it adopted its general approach to the DSA on 25 November 2021. The Council’s approach proposes fewer amendments than were tabled in Parliament, but the approach nonetheless seeks to include online search engines in the scope of the DSA and to strengthen public authorities’ ability to issue orders concerning illegal content to private companies.
Following the finalisation of the European Parliament’s position on the draft, the DSA will become the subject of the tripartite negotiations between the Commission, Parliament, and the Council known as the ‘trilogue’.
Measures to tackle criminal use of end-to-end encryption
Over the past year, there have been significant developments in EU initiatives that could have an impact on end-to-end encryption (E2EE). Terrorist use of such technology is often cited by the EU as justification for proposed action or regulation of E2EE, although child sexual abuse material is the reason most frequently mentioned.
The Council resolution “Security through encryption and security despite encryption”, published in November 2020, calls on EU institutions to propose a legislative framework for lawful law enforcement access to encrypted communications – mainly to counter terrorist activity and child sexual abuse material – and for facilitated cooperation with service providers. Further, the EU strategy for a more effective fight against child sexual abuse, published in December 2020, lays out eight initiatives meant to provide a framework for improving action on child sexual abuse, and notes that the use of end-to-end encryption makes detection of such crimes “impossible”. The strategy also tasks the EU Internet Forum (see below) with assessing technical solutions to detect and report child sexual abuse. Further, a leaked 2020 Commission report suggests that the EU is seriously exploring technical solutions to target criminal use of E2EE. This was reiterated in January 2021 by the EU Commissioner for Home Affairs, Ylva Johansson, who also confirmed that the Commission is exploring “permanent legislation” to counter child sexual abuse material online. In December 2021, Johansson announced that a proposal on this matter will be unveiled in early 2022. It is currently unclear to what extent such regulation would affect E2EE.
For more information about the EU’s regulatory approaches to criminal use of E2EE, as well as the approaches of other jurisdictions, see Tech Against Terrorism’s report on “Terrorist use of E2EE: State of Play, Misconceptions, and Mitigation Strategies.”
THE ONLINE REGULATION SERIES 2020 | EUROPEAN UNION
The European Union (EU) is an influential voice in the global debate on regulation of online speech. For that reason, two upcoming regulatory regimes might – in addition to shaping EU digital policy – create global precedents for how to regulate both online speech generally and terrorist content specifically.
European Union’s regulatory framework:
Key organisations and forums:
Key takeaways for tech platforms:
Regulation on addressing the dissemination of terrorist content online (from June 2022)
Digital Services Act: Proposed measures
Tech Against Terrorism’s Commentary
Regulation on addressing the dissemination of terrorist content online
In September 2018, the EU Commission introduced a proposed “Regulation on preventing the dissemination of terrorist content online”. The regulation has since undergone the EU’s legislative trilogue process of negotiation between the Commission, Parliament, and the Council. The Commission’s proposal drew criticism from academics, experts, and civil society groups. Further, the proposed regulation was criticised by three separate UN Special Rapporteurs, the Council of Europe, and the EU’s own Fundamental Rights Agency, which said that the proposal is in possible violation of the EU Charter of Fundamental Rights. Criticism mainly concerns the short removal deadline and the proactive measures instrument, which according to critics will lead to companies erring on the side of removal to avoid penalties. The EU Parliament’s reading of the proposal, unveiled in April 2019, provided some changes, for example by deleting the referral instrument and limiting the scope of the regulation to “public” dissemination of terrorist content to avoid covering private communications and cloud infrastructure. These changes were largely welcomed by civil society groups. In April 2021, the regulation was approved by Parliament following Council review. The law entered into force on 7 June 2021 and will be fully applicable from 7 June 2022.
At Tech Against Terrorism, we have, since the publication of the original proposal, highlighted our concerns about smaller platforms’ capacity to comply with the regulation and about its effectiveness in achieving its intended purpose of tackling terrorist content online and creating a safer EU. We have also shared several of the freedom of expression concerns raised by civil society groups. At the publication of the final regulation in 2021, we reiterated these concerns and highlighted that the regulation provides little legal certainty for platforms. It also offers little clarity on how smaller platforms will be supported in tackling this threat and in complying with the regulation.
We warned that this is a misjudgement from the EU that casts further doubt on the evidence base underpinning the regulation. As a result, we fear that the regulation will do little to achieve its intended purpose of tackling terrorist use of the internet and risks harming innovation and competition in the process. Furthermore, the EU should clarify what safeguards are in place to prevent authorities from abusing their position, and consider the incentives the law creates and what this means for the objectives the EU has set out in its overall tech strategy and the Digital Services Act.
See the update sections above for further information that the EU Commission has shared in this regard, which partially addresses some of the concerns raised here.
The Digital Services Act
The Digital Services Act is an ambitious proposal aimed at introducing several new regimes to tackle illegal and harmful online content in the EU. It was relatively well received by civil society groups, which commended its focus on transparency, accountability, and size-oriented obligations, and the fact that the liability scheme set out in the E-Commerce Directive remains largely intact, albeit slightly modified. However, civil society groups criticised certain aspects of the proposal. For example, the Electronic Frontier Foundation criticised the fact that, under Article 14, a notice confers actual knowledge, noting that this may lead companies to err on the side of removal, since such a notice can make them liable. This mechanism might force companies, as opposed to courts, to act as arbiters of legality. Digital rights organisation Article 19 added that there are insufficient human rights safeguards built into the risk assessment and audit provisions assigned to very large online platforms.
At Tech Against Terrorism, we noted that the DSA has several positive aspects, but that it is unlikely to contribute to preventing terrorist use of the internet. In our assessment, the DSA is – whilst more balanced than other regulation – part of a global trend in which governments and inter-governmental bodies implement mechanisms that risk undermining the rule of law. Furthermore, despite the stated aim of doing the opposite, the DSA may give private and democratically unaccountable tech platforms more power over online speech. The DSA is also part of another global trend in that it risks leading to increased extra-territorial enforcement of national law. There are risks that the DSA will lead to a more fragmented regulatory landscape in the EU, rather than harmonising it.
In our response to the draft DSA, we highlighted that governments and bodies like the EU should provide strategic leadership on matters related to terrorism, both online and offline. The DSA, in our view, does not do that, and (whilst containing several commendable aspects) focusses on the wrong issues in terms of tackling terrorist use of the internet. Instead, policymakers should focus on improving designation of far-right terrorist groups, supporting smaller tech companies (where most terrorist activity is located) in tackling terrorist use of their platforms, and formulating effective and human rights compliant approaches to tackling terrorist operated websites.
Annex 1. EU counterterrorism strategy
The EU’s Counter Terrorism Strategy, launched in 2005, provides a framework for the Union to respond to terrorism across four strands: prevent, protect, pursue, and respond. Whilst the strategy does not focus on terrorist use of the internet, it does mention the need to counter this as part of its “prevent” strand.
Many of the texts and bodies involved in tackling terrorist use of the internet in the EU were introduced around 2015. In April 2015, the EU adopted the European Agenda on Security, which addresses at length the prevention of terrorism and of radicalisation leading to terrorism, including terrorist use of the internet. The Agenda also committed the EU to setting up two collaborative schemes: Europol’s EU Internet Referral Unit (EU IRU) and the EU Internet Forum. This framework was further developed in the 2020 EU Security Union Strategy.
The key regulatory document guiding the EU-wide counterterrorism response is Directive (EU) 2017/541 (also known as the “Terrorism Directive”). The Directive replaced previous texts and provides definitions of key terms, including of “terrorist groups,” “terrorist offences”, and terrorist propaganda (“public provocation to commit a terrorist offence”). The Directive was partly introduced to better reflect the need to tackle terrorist use of the internet, and lays down guidelines for Member States to address this threat. For example, the Directive instructs Member States to ensure “prompt removal” of online terrorist content, whilst stressing that such efforts should be based on an “adequate level of legal certainty” and ensure that there are appropriate redress mechanisms in place.
Annex 2. Online terrorist content: foundational regulation
The main legal act outlining tech company responsibilities with regards to illegal and harmful content is the E-Commerce Directive of 2000. Whilst initially meant to break down obstacles to cross-border online services in the EU, the E-Commerce Directive also exempts tech companies from liability for illegal content (including terrorist content) that users create and share on their platforms, provided they act “expeditiously” to remove it. Further, Article 15 outlines that tech companies have no obligation to monitor their platforms for illegal content. This arrangement is being reconsidered by the EU, both through the proposed Regulation to combat online terrorist content and the Digital Services Act.
In 2018, the EU updated its Audio-Visual Media Services Directive (AVMSD), which governs Union-wide coordination of national legislation on audio-visual services (such as television broadcasts), to include online video-sharing platforms (VSPs). It encourages Member States to ensure that VSPs under their jurisdiction comply with the requirements set out in the AVMSD, including preventing the dissemination of terrorist content. In a Communication, the European Commission specified that VSP status primarily concerns platforms that have the sharing of user-generated video content either as their main purpose or as one of their core purposes. This means that, in theory, the AVMSD could apply to social media platforms on which videos are shared, including livestreaming functions.
Annex 3. EU-led voluntary collaborative forums to tackle terrorist use of the internet
UN Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, Fionnuala Ní Aoláin; UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan; UN Special Rapporteur on the rights to freedom of peaceful assembly and of association, Clement Nyaletsossi Voule; UN Special Rapporteur on the human rights of migrants, Felipe González Morales; UN Special Rapporteur on minority issues, Fernand de Varennes; UN Special Rapporteur on the right to privacy, Ana Brian Nougrères; and UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, E. Tendayi Achiume. All seven signed a letter addressed to EU Commission President Ursula von der Leyen, EU Home Affairs Commissioner Ylva Johansson, EU Counterterrorism Coordinator Ilkka Salmi, and EU Parliament President David Maria Sassoli raising concerns about the EU’s counterterrorism agenda, including the TCO regulation.
Fionnuala Ní Aoláin, Irene Khan, Clement Nyaletsossi Voule, Felipe González Morales, Fernand de Varennes, Ana Brian Nougrères, and E. Tendayi Achiume (2021), “Letter addressed to EU Commission President Ursula von der Leyen, EU Home Affairs Commissioner Ylva Johansson, EU Counterterrorism Coordinator Ilkka Salmi, and EU Parliament President David Maria Sassoli”.
EDRi (2021), Digital Services Act: The EDRi guide to 2,297 amendment proposals, EDRi.
 The amendment mentions sporting or entertainment events as examples.
 Cory Doctorow (2021), Europe’s Digital Services Act: On a Collision Course With Human Rights, Electronic Frontier Foundation.
EDRi (2021), Warning: the EU’s Digital Services Act could repeat TERREG’s mistakes, EDRi.
 Legislative train (2021), A Europe Fit for the Digital Age, European Parliament.
Ilaria Buri and Joris van Hoboken (2021), The general approach of the Council to the Digital Services Act, DSA Observatory.
Such as Council Framework Decision 2002/475/JHA.
This has some similarity to Section 230 of the US Communications Decency Act, which exempts tech companies from legal liability for user-generated content on their platforms.
Tech Against Terrorism has participated in EUIF meetings since 2017.
By censorship creep, Citron means that online counterterrorism efforts or mechanisms risk taking on functions beyond their intended purpose, which risks leading to censorship of legal and legitimate speech online.
Adam Hadley & Jacob Berntsson (2020), “The EU’s terrorist content regulation: concerns about effectiveness and impact on smaller tech platforms”, VOX-Pol.
Tech Against Terrorism (2020), Summary of our response to the EU Digital Services Act consultation process, Tech Against Terrorism.
David Kaye, Fionnuala Ní Aoláin, and Joseph Cannataci (2018), Letter from the mandates of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression; the Special Rapporteur on the right to privacy; and the Special Rapporteur on the promotion and protection of human rights while countering terrorism, UN OHCHR.
Danielle Citron (2018), “Extremist Speech, Compelled Conformity, and Censorship Creep”, Notre Dame Law Review.
Daphne Keller (2019), “The EU’s terrorist content regulation: expanding the rule of platform terms of service and exporting expression restrictions from the EU’s most conservative member states”, Stanford Cyber Policy Center.
Article 19, Article 19’s Recommendations for the EU Digital Services Act.
Access Now (2020), “How the Digital Services Act could hack Big Tech’s human rights problem”.
Europol (2019), EU IRU 2018 transparency report.
Europol (2020), EU IRU 2019 transparency report.