On 15 July, Tech Against Terrorism organised a webinar together with the GIFCT, discussing the importance of accountability mechanisms for tech platforms and sharing best practice and lessons learned in this regard.


Speakers:

Zoe Darmé, Business Programme Manager, Microsoft (GIFCT Chair for 2020)

Jillian York, Director for International Freedom of Expression, Electronic Frontier Foundation

Dina Hussein, Counter Terrorism and Dangerous Organisations, Facebook 

Sean Li, Director of Trust & Safety, Discord

The session was moderated by Tech Against Terrorism’s Maygane Janin.  

Tech company content moderation decisions have a significant impact on online speech and freedom of expression; it is therefore essential that the tech industry, including smaller companies, has the tools to introduce appropriate and proportionate accountability mechanisms. Understanding which mechanisms should be employed, and how and why tech companies should be held accountable for content removal decisions, is essential to creating robust processes and procedures that everyone, both within the tech industry and in the wider community, can feel confident in.

In this webinar, leading experts from the GIFCT, tech platforms, and civil society look at the issues involved in developing robust accountability mechanisms for tech platforms, and address the importance of developing fair processes to ensure online freedom of expression. To do so, they share their insights on some of the benefits and challenges involved in developing strong accountability mechanisms, and explore practical ways in which tech companies can achieve best practice in this area. 

The webinar thus takes a comprehensive look at accountability, discussing the different mechanisms and thought processes essential for tech companies to ensure accountability on their platforms: from transparency reporting and clearly laid-out community guidelines to smooth, easy-to-access user appeal processes and the importance of product and policy development.

Our panellists also examine key challenges of accountability mechanisms: for instance, the extent to which users should be notified that their content is going to be taken down and given the right to appeal the decision. In particular, they consider whether users should be allowed to appeal the removal of terrorist content. The panel also discusses transparency reporting, its importance for accountability, and the extent to which tech companies should be required to disclose how many posts they have removed and why.

Accountability for tech platforms is further complicated by the fact that there is no globally agreed definition of terrorism. Accountability mechanisms therefore need to balance content moderation with the safeguarding of freedom of expression, a balance that requires careful thought on all sides about how it can be achieved.

Finally, the webinar considers the role that civil society can and should play in holding tech companies to account, whether by being directly involved in the thought and enforcement processes behind accountability mechanisms, or by ensuring that it is able to keep tech companies in check when it comes to online expression and accountability.

To access a recording of the webinar, please get in touch at [email protected].