For any questions or media requests, please get in touch via:
[email protected]

Until the United Kingdom's Online Harms regime – announced in a White Paper in April 2019 and aiming to counter harmful content online – is implemented, an interim regime will apply, with the UK communications regulator Ofcom acting as regulator for online Video-Sharing Platforms (VSPs) in order to meet the UK's obligations under the EU's Audiovisual Media Services Directive (AVMSD) 2018.

Ofcom will be given new powers to regulate UK-established VSPs. This includes ensuring that VSPs have appropriate measures in place to protect users from illegal content as well as incitement to hatred and violence.

Ofcom opened a call for evidence to inform its guidance for UK-established VSPs, which closed on 24 September.

Our arguments throughout our response can be summarised as follows:

  • Accountability – Governments need to provide more leadership and strategic thinking in tackling terrorist use of the internet and not place the onus on private companies.
  • Rule of Law – Counterterrorism and tackling online harms need to be based on the rule of law and pay due regard for human rights, in particular freedom of expression.
  • Transparency – We encourage both governments and tech companies to be transparent in their efforts to counter terrorist use of the internet.

Our response highlights:

Uncertainty around definitions and illegality

Terrorism is a contested term without a universally accepted definition. Whilst the UK-only jurisdiction of the Ofcom regulatory scheme removes some of this definitional complexity, the scheme may still have global consequences, given the international user bases of the VSPs in question.

One of our observations when engaging with tech companies is that they struggle with moderating content on their sites due to definitional uncertainty. One way to increase clarity around what constitutes terrorism, and by extension terrorist content, is to improve designation and other legal instruments.

However, we also note that even when content clearly depicts or is created by a designated terrorist group, smaller VSPs and tech companies might struggle to adjudicate on its illegality, since it is often unclear how a specific piece of terrorist content relates to specific legislation. We encourage Ofcom to take such difficulties into account and to identify measures that can help companies in such adjudication.

Moreover, it may be difficult for companies to assess whether content is terrorist propaganda, or whether it is part of journalistic reporting or academic research on terrorism, or civil society efforts to collect evidence of war crimes and human rights abuses. When platforms fail to make this distinction, they are often criticised; however, as yet there are no clear guidelines to assist platforms in making these decisions, particularly when their audiences are international. We encourage Ofcom to take this into consideration so as not to unduly punish smaller VSPs that struggle to make this distinction.

Small VSPs

We find that the VSPs that struggle to handle abuse of their platforms are overwhelmingly smaller VSPs. For reference, a micro-platform consists of 1-2 staff, and a small platform can have anything up to 50-100 staff.

Members of small teams are less likely to have discrete monitoring duties and are more likely to struggle to swiftly remove content at scale. This may also be true for some larger VSPs, which despite having user bases of millions may have Trust and Safety teams of fewer than half a dozen staff. Video is also difficult to moderate quickly without automated systems, and small VSPs will struggle to build these systems on their own.

Terrorists are aware of this and deliberately target smaller platforms, including VSPs, since they know that these platforms might struggle to take effective action against terrorist activity. This is one reason why a majority of terrorist content is located on smaller platforms.

Placing requirements that are impossible for smaller VSPs to meet risks pushing them either out of business or out of the UK, and by extension harming innovation and competition. 

Recommendations

We recommend that Ofcom:

  1. Anchors regulation in the rule of law and ensures that it does not promote removal of legal speech via extra-legal means.

  2. Acknowledges the size of VSPs when drafting corporate transparency requirements and approaches this issue with sufficient regard to proportionality.

  3. Addresses the limited capacity of smaller VSPs to institute redress mechanisms, and ensures that any penalty for failing to introduce an effective redress mechanism contains an exemption criterion based on the size of the VSP's workforce.

  4. Ensures that there is a support plan for smaller companies that might struggle to meet requirements, so as not to harm innovation and competition.

  5. Encourages the Government to ensure designation lists are updated and robust. We also recommend that Ofcom consults the Consolidated United Nations Security Council Sanctions list, as it provides the best international consensus framework on terrorist groups.

  6. Encourages government transparency about its online counterterrorism measures.

  7. Supports more responsible publication codes from mainstream media, which, by sharing terrorist propaganda, can extend the longevity of such content online and undermine tech sector efforts to tackle it.

Further resources

For further Tech Against Terrorism resources on this topic, please see our:



Background to Tech Against Terrorism

Tech Against Terrorism is an initiative supporting the global technology sector in responding to terrorist use of the internet whilst respecting human rights, and we work to promote public-private partnerships to mitigate this threat. Our research shows that terrorist groups – both Islamist and far-right terrorists – consistently exploit smaller tech platforms when disseminating propaganda. At Tech Against Terrorism, our mission is to support smaller tech companies in tackling this threat whilst respecting human rights and to provide companies with practical tools to facilitate this process. As a public-private partnership, the initiative works with the UN and the Global Internet Forum to Counter Terrorism (GIFCT) and has been supported by the governments of Spain, Switzerland, the Republic of Korea, and Canada.