Content Moderation, Transparency (Reporting) and Human Rights

Our Cyber Threats Research Centre colleagues couldn’t host an in-person TASM Conference this year, but instead organised a week of virtual events from 21 to 25 June 2021. This post is the third in a three-part series based on overviews of three of the virtual TASM panels. Read parts one and two. [Ed.]

By Lucy Brown

The Christchurch terrorist attack of 15 March 2019, in which 51 innocent lives were taken, shocked the globe as the terrorist livestreamed the events in an attempt to publicise his despicable acts. The livestreamed attack prompted international calls for greater transparency from online service providers where terrorist and violent extremist content (TVEC) appears on their services. It also raised difficult questions: how can we stop TVEC being promoted on online services? And how can we moderate this content while ensuring the promotion and protection of human rights and an open, free and secure internet?

On Wednesday 23 June 2021, the Legal Innovation Lab at Swansea University was joined by an impressive panel of experts – Jeremy West of the Organisation for Economic Co-operation and Development (OECD), Maygane Janin of Tech Against Terrorism and Gabrielle Guillemin of Article 19, chaired by Dr Katy Vaughan of Swansea University – who offered their invaluable insights on this critical issue. Our Content Moderation, Transparency (Reporting) and Human Rights panel delved into the complexities of moderating content through transparency reporting by online content-sharing services, and how this impacts an individual’s human rights.

The protection and promotion of human rights are fundamental considerations when moderating online content: freedom of expression, the right to privacy, freedom from discrimination, and due process are just a few examples. Gabrielle Guillemin pointed out that one of the challenges with the use of upload filters is inaccuracy, leading to the removal of difficult and controversial content that may not be TVEC and should not be removed. This interferes with the right to freedom of expression and highlights the importance of a multi-stakeholder approach to content moderation that includes civil society organisations such as Article 19. Maygane Janin stressed that it is fundamental that organisations, including Tech Against Terrorism, support the tech industry in tackling terrorist exploitation of the internet whilst respecting human rights.

Transparency reporting

Jeremy West began by highlighting the importance of transparency reporting, indicating that online content-sharing services are moving in the right direction, with an increasing number of services now issuing TVEC-specific transparency reports. However, it is abundantly clear that issues remain. First, inconsistent and broad definitions of ‘terrorism’ and ‘violent extremism’ result in stark differences between transparency reports, making it difficult to gauge from these reports the overall effectiveness of industry practices against TVEC. Second, there are vast variations between online services in terms of their size, resources and values. Regulatory strategies will arguably differ depending on these factors, which creates a disconnected web of platforms producing divergent transparency reports.

Finally, research has illuminated parallel problems on the governmental side. A number of jurisdictions either have enacted, or are considering enacting, regulations in this area that would or could require transparency reporting around TVEC; however, these regulations are not coordinated, calling for different kinds of information and metrics on different timetables. Platforms operating across many jurisdictions will therefore face an onslaught of transparency reports to produce, at immense cost and workload. Indeed, as Jeremy emphasised, “there is a growing risk of regulatory fragmentation”.

Practical solutions

Do not despair! Our panellists pointed towards a range of solutions to some of the issues discussed. The OECD is committed to becoming a multi-sided platform providing an efficient service to governments, the public and businesses, both by facilitating an international, multi-stakeholder consensus on standardised TVEC transparency reporting and by serving as a hub for TVEC transparency reports. This should act as a convenient, centralised web portal, combating the difficulties caused by companies producing vastly different transparency reports.

Moreover, Tech Against Terrorism launched its Knowledge Sharing Platform (KSP) this month, which provides smaller tech companies with a collection of resources, guidelines and recommendations to support them in tackling TVEC in compliance with human rights and the rule of law, whilst increasing transparency and accountability towards their users. In this respect, Maygane Janin emphasised the importance of respecting platform autonomy in building trust and a good working relationship with tech companies. Tech Against Terrorism’s transparency guidelines extend beyond transparency reporting, taking a holistic approach whilst acknowledging platform diversity and differences in resources and capacity. The aim is to support smaller platforms in developing the policies that underpin transparency reporting.

Multi-stakeholder approach

Content moderation and calls for increased transparency from online content-sharing services are therefore far from simple. Content moderation itself involves complex considerations, including the impact on human rights. The panellists outlined a number of positive steps to address these issues, providing practical solutions. However, the work of tackling TVEC is far from over, and multi-stakeholder initiatives will be of great importance moving forward.


Lucy Brown has recently completed her Law LLB at Swansea University, and is currently working as a research intern for CYTREC supporting the work of the Christchurch Call Advisory Network. On Twitter @CYTREC_.

Image credit: Pixabay.