
THE ONLINE REGULATION SERIES | IRELAND

Since the 2000s, Ireland has emerged as the preferred destination for many leading tech companies, and for many members of the Global Internet Forum to Counter Terrorism (GIFCT) – including all founding members: Meta, Twitter, and Microsoft[mfn]The fourth GIFCT founding member, YouTube, does not have headquarters in Ireland; however, Google does.[/mfn] – to establish their EU headquarters. This concentration of tech headquarters in Dublin places Ireland in a unique position from the perspective of online regulation, in that all major tech companies will have to abide by Ireland’s forthcoming Online Safety and Media Regulation Bill. Dublin also oversees tech companies’ compliance with EU regulations[mfn]To learn more about the EU online regulatory framework, see our Online Regulation Series Handbook (pp. 97–106).[/mfn] as a result of the “home country supervision”[mfn]This principle stipulates that “only the country where the service provider has legal residence is entitled to impose corrective measures”. However, this principle is being challenged by certain EU Member States, in particular with regard to its application to the upcoming DSA. At the forefront of this challenge to the “home country supervision” principle is France, which has been arguing that it should be replaced by a “country of destination” principle, whereby a platform could be subject to the jurisdiction of the countries in which it operates.
See: Bertuzzi Luca (2021), Ireland draws a red line on country of origin principle in DSA, Euractiv.[/mfn] principle of the EU e-Commerce Directive, and this is to be reinforced by the upcoming Digital Services Act.[mfn]On the Digital Services Act, see our entry dedicated to the EU online regulation framework in our Online Regulation Series Handbook (pp. 97–106).[/mfn] Ireland is already entrusted with overseeing tech companies’ compliance with the EU’s General Data Protection Regulation (GDPR).[mfn]See: Ireland Data Protection Commission; and Lillington Karlin (2020), Enforcement proves the Achilles heel for GDPR, The Irish Times.[/mfn]

Ireland’s regulatory framework:

  • Offences Against the State Acts 1939 – 1998: Ireland’s main counterterrorism legislation. The original Offences Against the State Act, passed in 1939, was mainly introduced to counter domestic terrorism in Ireland.[mfn]Ireland’s Department of Justice website states that the country’s counterterrorism framework has been primarily used to counter “the threat posed by the IRA in all its manifestations, including, latterly, the dissident republican terrorist organisations of the so-called Real IRA and Continuity IRA.”
    See: Ireland Department of Justice, Terrorism.[/mfn] However, subsequent counterterrorism legislation, including the Criminal Justice (Terrorist Offences) Act of 2005, was enacted to strengthen the country’s response to international terrorism.[mfn]Ibid.[/mfn] Irish counterterrorism legislation does not currently explicitly address terrorist use of the internet.
  • The Online Safety and Media Regulation (OSMR) Bill: first introduced in December 2020, with additional provisions announced in January 2021. The Bill aims to tackle “harmful” content online and to align Ireland with the EU’s Audiovisual Media Services Directive of 2018. Whilst the Bill mainly covers regulation of video and audio-sharing platforms in Ireland, the provisions related to countering harmful content online encompass a broader range of online platforms – including social media, file-sharing, and private messaging services. The Bill is currently in its committee stage at the Oireachtas,[mfn]The Irish Parliament.[/mfn] and a joint committee published a report on pre-legislative scrutiny of the Bill in November 2021, containing 33 recommendations covering the different areas addressed by the Bill.

Main regulatory bodies:

  • Media Commission: The OSMR Bill stipulates that the current Broadcasting Authority of Ireland, the regulatory body for television and radio, will be replaced by a Media Commission acting as a regulatory body for both online platforms and broadcasting, and as a supervisory body for the implementation of and compliance with the OSMR.

Key takeaways for tech companies:

Online Safety and Media Regulation Bill

  • The OSMR’s fundamental objective is to regulate video and audio-sharing platforms in Ireland, and it includes dedicated provisions for on-demand audio-visual services such as Netflix.
  • However, Part 4 of the Bill is dedicated to Online Safety and expands the provisions on countering criminal and harmful content to other online services – including social media, search engines, private communications, and private storage services.
  • The Bill does not specify which online services will have to comply with the provisions on online safety. Instead, the Bill provides a non-exhaustive list of services that can be designated to fall within the scope of the law, at the discretion of the Media Commission. The Bill lists different factors that the Media Commission should consider when requiring online services to comply with the OSMR in order “to take a proportionate risk-based approach to designation”. Factors for consideration include the nature and scale of the online service, as well as the fundamental rights of users and of the online services in question.
  • A key element of the OSMR is the categorisation of the “harmful online content” that platforms will need to counter. In considering what constitutes harmful online content, Irish legislators have opted not to provide a singular definition but instead to “enumerate definitions of categories of material that are considered to be harmful online content”. The four categories of harmful online content listed in the Bill are:
    • Material that is illegal to disseminate under Irish or EU law, which includes material containing “public provocation to commit a terrorist offence” as well as incitement to violence or hatred;
    • “Material which is likely to have the effect of intimidating, threatening, humiliating or persecuting a person to which it pertains”;
    • Material that encourages eating disorders;
    • Material that encourages self-harm or suicide.
  • Besides the above categories of content, the Bill provides that the Media Commission will be able to identify new categories of content that may be added to the list[mfn]Kirk Niamh, Farris Elizabeth, Shankar Kalpana, Ireland’s Online Safety and Media Regulation Bill: too little too soon?, UCD Centre for Digital Policy.[/mfn] by submitting proposals to the relevant Minister, with the recommendation that they be adopted by the Government. Decisions on whether to add material to the list of harmful online content are to be approved by both Houses of the Oireachtas as soon as possible after the Media Commission has made a proposal, and take effect immediately unless a resolution to annul them is passed within 21 days. The Bill lists the considerations, including online users’ fundamental rights, that the Government should weigh when deciding on a proposal to include or exclude categories of content as harmful online content.
  • The Bill also specifies what material is out of scope and what material is dealt with by other laws and regulatory bodies – for instance, material violating consumer protection law or containing a defamatory statement. The Parliamentary Joint Committee has, however, recommended in its report on the Bill that defamatory content be added to the list of harmful online content.
  • The Bill does not include specific provisions on how platforms are to respond to harmful online content on their services. However, it delegates to the Media Commission the power to draft Online Safety Codes (see below).
  • The Bill also includes provisions on “voluntary arrangements” to be agreed between the Commission and online services not established in Ireland, under which the latter commit either to comply with the Safety Codes or to follow guidance on them. If a platform fails to comply with a voluntary arrangement, the Commission may revoke it.

Online Safety Codes

  • The Bill states that the Media Commission will prepare and revise (when needed) the Online Safety Codes to be observed by the designated “online services”. The Commission may also issue relevant guidance.
  • The Codes will outline requirements for tech companies to moderate content in line with the aims and provisions of the OSMR, namely concerning:
    • Minimising the availability of harmful content online;
    • Commercial communications;
    • User complaints;
    • Risk and impact assessments on the availability of harmful content;
    • Reporting obligations.
  • The Media Commission will consider different factors in developing the Safety Codes, including the “nature and scale” of the online services, the fundamental rights of the users, as well as the role of public figures in the public discourse.
  • Online services will have to comply with the Safety Codes, and the Commission may require online services to report periodically on their compliance.
  • The Bill states that the Media Commission cannot oblige an online service offering interpersonal communication or private online storage services to abide by a Safety Code that does not relate to content that it would be a criminal offence to share.
  • The Commission can take the following steps to sanction non-compliance with the Safety Codes:
    • Issue an administrative financial sanction of up to approximately $23,000,000[mfn]€20,000,000[/mfn] or up to 10% of the platform’s turnover in the preceding financial year;
    • Ask Ireland’s High Court to compel an online service to take the steps the Commission deems necessary for compliance;
    • Ask the High Court to compel internet service providers to block access to an online service in Ireland.
  • Online services also need to comply with information requests from the Commission. Non-compliance with a request is considered a category 1 offence, meaning that it is punishable with fines or imprisonment.
  • The senior management board of a tech company can also be sanctioned, as bearers of secondary liability, for failing to act on a compliance or warning notice issued by the Commission if “the offence is proved to have been committed with the consent or connivance of, or to have been attributable to any neglect/wilful neglect on the part of a person who was either […]”.
  • The possibility that the future Safety Codes might include provisions requiring platforms to filter or monitor content on their services is not excluded. However, the Bill does note that, under the e-Commerce Directive, general monitoring obligations cannot be imposed on online services.

Media Commission

  • At the heart of the OSMR is the Media Commission established to replace the Broadcasting Authority of Ireland. The Media Commission will have enhanced powers to enforce compliance and issue sanctions on online platforms, including the power to block online services.
  • The Commission is to establish a “systematic complaint scheme” for nominated bodies, such as expert NGOs or members of the European Regulators Group for Audiovisual Media Services, to raise issues they have with a designated online service in relation to tackling harmful online content.
  • The Media Commission will have the power to delegate functions to commissioners, who can include a dedicated Online Safety Commissioner.
  • The Media Commission, and more precisely a dedicated Online Safety Commissioner, will, according to reports, not be in place before 2024 – as the Bill is not expected to pass before March 2022.[mfn]Facebook does not expect the Bill to be passed before March 2022. In reaction to the slow progress of the OSMR through the Oireachtas consultation process, Facebook has suggested assigning the Commissioner’s powers to another body (e.g. the existing Broadcasting Authority).
    See: Weckler Adrian (2021), Social media networks believe slow adoption of law could delay Irish online safety commissioner until 2024, Independent.ie[/mfn]

Tech Against Terrorism’s Analysis and Commentary

Similar yet different to other online regulations

The OSMR bears similarities to other online regulations passed or proposed since 2017 in its broad scope of application, embracing as it does online service providers across the tech ecosystem, and in its requirements for platforms to act against online content that is considered harmful despite not otherwise carrying criminal liability under Irish law.

On the question of content that is harmful but otherwise legal, the OSMR is interesting in being one of the few regulatory proposals analysed in the Online Regulation Series to explicitly acknowledge that not all “harmful” content categories listed are considered criminal under domestic law. Irish policymakers further explain this difference between harmful online content and criminal content, and the reason for prohibiting non-criminal content, in the explanatory note for Part 4 of the Bill. To further clarify what is considered criminal content, Global Partners Digital (GPD) recommended that the Irish government maintain a full list of criminal offences covered by the Bill.[mfn]Global Partners Digital (2021), Written Submission on the General Scheme of the Online Safety and Media Regulation Bill.[/mfn] Even though Irish policymakers are rather explicit in stating that not all harmful content is unlawful, the Bill nonetheless follows a key trend in online regulation by creating a differentiated regime for what is considered acceptable speech online as opposed to offline. The Irish Council for Civil Liberties (ICCL) expressed its concerns with the inclusion of non-illegal speech, which it considers an infringement of fundamental rights and freedom of speech.[mfn]Kirk Niamh, Farris Elizabeth, Shankar Kalpana.[/mfn]

In differentiating between content that is criminal and content that is harmful, the Bill also specifies that private communication and file-hosting services will only have to comply with the provisions of the future Safety Codes that relate to criminal content. Despite this exception, concerns about the Bill’s applicability to encrypted platforms, and what the Safety Codes will expect of such platforms in acting against criminal content, nonetheless remain valid. As Tech Against Terrorism outlined in our landmark report assessing terrorist use of end-to-end encrypted (E2EE) services, any legal insistence that platforms offering E2EE systematically monitor their services for illegal content would require them to break encryption and the promise of privacy it entails, significantly infringing on online privacy and security in the process. The Safety Codes should therefore clarify what will be expected of private communication and file-hosting services, and how the Bill will safeguard the right to privacy online.[mfn]Irish Council for Civil Liberties (2021), ICCL submission to Pre-legislative scrutiny of the General Scheme of the Online Safety and Media Regulation Bill.[/mfn]

Throughout the Bill, Irish policymakers present different factors that the Commission should consider in issuing the Safety Codes and related practical guidance material, as well as in designating the online services falling within the scope of the Bill. The provisions outlined in the Bill and the related explanatory notes thus state that the tackling of harmful online content is to be conducted in a proportionate manner that considers platforms’ size and services, recognises the potentially negative impact of automated moderation on users’ rights, and safeguards fundamental rights. In doing so, the Bill appears to respond to criticisms raised by tech sector and digital rights experts of other online regulations passed in recent years, which have been faulted for failing to account for platforms’ capacity and for risking breaches of fundamental rights – including by mandating or encouraging the use of automated moderation tools.[mfn]Tech Against Terrorism (2021), The Online Regulation Series Handbook.[/mfn] It remains to be seen how the Media Commission will weigh these different factors when implementing and overseeing the OSMR.

Lack of clarity and risks to fundamental rights

As it stands, the OSMR Bill delegates substantial power to the future Media Commission to decide which online services are to fall within the scope of the law, and to expand the provisional schedule of what constitutes harmful online content. This creates uncertainty as to how the law will work in practice, both now and in the future. Tech Against Terrorism acknowledges that the flexibility of the Bill, in including provisions related to the future expansion of its scope, is helpful in permitting adaptation to emerging online threats. However, without proper safeguards there is a risk of the OSMR significantly infringing on users’ fundamental rights, notably freedom of expression, as there is no fixed limit to its scope of application. The ICCL has also raised similar concerns about the actual role of the Online Safety Commissioner, which it considers the Bill fails to clarify.[mfn]Irish Council for Civil Liberties (2021).[/mfn]

The ICCL, Digital Rights Ireland, and the Irish Human Rights and Equality Commission expressed their concerns about the lack of clarity in the Bill during a hearing with the Oireachtas Media Committee: “It is wholly unclear who can expect to be regulated by the proposed Media Commission and when”. According to the three organisations, the lack of specific definitions and the unclear scope of the Bill, as well as the broad powers allocated to the Media Commission, could “restrict the voices of internet users”.[mfn]Extra.ie (2021), Social media Bill could ‘seriously damage’ users’ constitutional rights.[/mfn]

GPD expressed further concerns about the “systematic complaint scheme”, which it considers likely to lead to over-reporting of platforms “not doing enough” rather than highlighting cases where over-compliance is leading to the removal of legal and harmless content. GPD notes that this “risks creating an imbalanced picture of what online service providers are doing when it comes to compliance, focusing solely on under-compliance rather than over-compliance,” which might encourage platforms to over-remove content in order to prevent complaints via the scheme.

Harsh penalties and risks for tech sector diversity

The OSMR allows the Media Commission to impose harsh penalties on platforms for non-compliance. These range from hefty administrative fines to the blocking of access to an online platform within Ireland, as well as individual liability for platforms’ employees.

Tech Against Terrorism has previously expressed its concerns about the use of hefty financial sanctions for non-compliance with online regulation, which risk penalising smaller and newer platforms and incapacitating them, instead of offering them the support needed to counter terrorist content online. Fines also carry the risk of reducing competition in the tech sector if smaller platforms are financially compromised and made permanently uncompetitive, and only larger tech companies are able to pay them.

Tech Against Terrorism acknowledges that legal liability against tech platforms’ senior management is limited to secondary liability, and that regulators carry the burden of proof in establishing that such an employee culpably failed to heed a compliance or warning notice issued by the Commission: “the offence is proved to have been committed with the consent or connivance of, or to have been attributable to any neglect/wilful neglect […]”. However, as explained in the Online Regulation Series Handbook, Tech Against Terrorism warns against such provisions, which risk criminalising individuals engaged in countering the diffusion of terrorist and violent extremist material rather than those responsible for diffusing such content.

Laws on illegal and harmful content enacted in Europe often influence online regulation globally, with the inherent risk that non-democratic countries will use laws passed in Europe as a justification, or model, for stringent legal restrictions on online speech. There is thus a risk of non-democratic countries replicating provisions on legal liability for tech platforms’ employees, which could see platforms and their employees become the targets of crackdowns on political dissent and non-violent speech – in particular in countries where the definitions of terrorism and violent extremism are broad.

Additional resources

PwC (2020), 7 things you need to know about the proposed Online Safety and Media Regulation Bill.

McCurry Cate (2021), Bill to regulate online harmful content ‘damages’ constitutional rights, Oireachtas committee told, The Irish News.