By Sophia Rothut, Heidi Schulze, Diana Rieger, Catherine Bouko, Brigitte Naderer
The European regulation on addressing the dissemination of terrorist content online (TCO Regulation) is, like the DSA, an approach to combating the spread of terrorist content online that is (1) transnational (EU-wide) and (2) holds Hosting Service Providers (HSPs) accountable. Under this regulation, swift removal of terrorist content becomes mandatory: If an HSP receives an official removal order from a competent authority, it must remove the respective content within one hour of receipt. The TCO Regulation not only requires HSPs to be ready to act 24/7 but also imposes several further obligations on them. Let us take a brief look at six key obligations for HSPs.
1. Draft and apply Terms of Service (ToS) prohibiting terrorist content
As a binding agreement between the user and the HSP, ToS define appropriate as well as permitted use of the platform. Regarding the TCO Regulation, two things are especially important:
- The explanation of the HSP’s strategy and measures to address the dissemination of terrorist content, including an explanation of the use and functioning of automated tools.
- The prohibition of the dissemination of terrorist content.
ToS are required by law, but they also help HSPs protect themselves from legal liability and shape how their platform is meant to be used. It is therefore worth devoting deliberate effort to drafting ToS that build in useful protections for the platform, in line with, but also going beyond, the TCO Regulation.
2. Identify and remove terrorist content
As the core of the TCO Regulation, HSPs are now required to remove terrorist content within one hour of receiving an official removal order. Furthermore, an HSP's ability to assess the illegality of content and its terrorist nature is crucial for the implementation of the TCO Regulation and, beyond that, for protecting its platform. If an HSP has already received a removal order, the assessment of the content as terrorist has already been made by the competent authority. Re-assessment can nevertheless be useful when an HSP is in doubt and considers challenging or appealing a removal order. If an HSP is formally 'exposed to terrorist content', which is the case after it has received two or more final removal orders within one year, it must take specific measures to protect its platform from the spread of terrorist content. These measures can include the proactive detection and identification of terrorist content. Importantly, decisions about the terrorist nature of content are not always easy: Oftentimes, they are highly contextual and sensitive, and fundamental rights such as freedom of expression must be carefully considered.
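For IT teams implementing this obligation, the two time-based rules above (the one-hour removal deadline and the 'exposed' status after two or more final removal orders within a year) can be tracked programmatically. The following is a minimal, hypothetical sketch; the class and field names are illustrative and not part of the Regulation or any official API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class RemovalOrder:
    """Illustrative record of an official removal order (hypothetical schema)."""
    order_id: str
    content_url: str
    received_at: datetime

    def deadline(self) -> datetime:
        # Content must be removed (or access to it disabled in the EU)
        # within one hour of receiving the order.
        return self.received_at + timedelta(hours=1)

    def is_overdue(self, now: datetime) -> bool:
        return now > self.deadline()

@dataclass
class ExposureTracker:
    """Tracks whether an HSP counts as 'exposed to terrorist content',
    i.e. has received two or more final removal orders within one year."""
    final_orders: list = field(default_factory=list)

    def record_final_order(self, received_at: datetime) -> None:
        self.final_orders.append(received_at)

    def is_exposed(self, now: datetime) -> bool:
        window_start = now - timedelta(days=365)
        recent = [t for t in self.final_orders if t >= window_start]
        return len(recent) >= 2
```

In practice, a real system would also need authenticated receipt of orders and alerting well before the deadline; this sketch only captures the timing logic.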
3. Establish effective moderation measures
If a piece of content violates any rule, whether legal or platform-specific, as outlined in the ToS, it may be moderated. This means specific measures can be taken to limit its reach. The measure that must be applied in case of a removal order is clear: Content moderation always includes the removal of the respective terrorist content – technically, this can also mean disabling access to it or geo-blocking it in the EU. However, HSPs and users can contest the removal if they disagree. Allowing for legal remedy is also one reason why HSPs need to securely preserve all removed content for six months. If HSPs become aware of terrorist content on their platform before receiving a removal order, they must inform the competent authority of the EU Member State concerned. In case HSPs encounter other forms of harmful (i.e., non-terrorist) content on their platforms, alternative moderation measures may be considered.
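The moderation outcomes described above (removal, disabling access, geo-blocking in the EU) combined with the six-month preservation duty can be sketched as a small data model. Again, this is a hypothetical illustration, not an official schema:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from enum import Enum

class Action(Enum):
    REMOVE = "remove"              # delete the content entirely
    DISABLE_ACCESS = "disable"     # technically disable access to it
    GEO_BLOCK_EU = "geo_block_eu"  # restrict visibility in the EU only

@dataclass
class PreservedItem:
    """A moderated piece of content, archived for possible legal remedy."""
    content_id: str
    payload: bytes        # the removed content itself, to be stored securely
    removed_at: datetime
    action: Action

    def retention_deadline(self) -> datetime:
        # Removed content must be preserved for six months so that
        # complaints and legal remedies remain possible
        # (approximated here as 182 days).
        return self.removed_at + timedelta(days=182)

def moderate(content_id: str, payload: bytes, action: Action,
             now: datetime, archive: list) -> PreservedItem:
    """Apply a moderation action and archive the content."""
    item = PreservedItem(content_id, payload, now, action)
    archive.append(item)
    return item
```

A production system would of course store the archive encrypted and access-controlled rather than in an in-memory list; the sketch only shows what information needs to be retained and for how long.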
4. Appoint contact points and/or legal representatives
The TCO Regulation distinguishes between contact points and legal representatives. Every HSP, regardless of whether they are technically exposed to terrorist content and regardless of their location inside or outside the EU, must establish a contact point. The contact point is responsible for the receipt and swift processing of removal orders. HSPs that do not have their main establishment in the EU need to designate a legal representative. This is a natural or legal person located in the EU who is then responsible for receiving, complying with and enforcing the TCO Regulation.
5. Set up a user notification and complaint system for removed content
After HSPs have removed a user's content, they must notify the user of this action and, as a control mechanism, give the user the opportunity to contest the removal through an effective, accessible, and user-friendly appeals process. Such complaints can be the first step towards a legal remedy. Setting up a user notification and complaint system is, thus, important, and it can also be used when the platform proactively moderates content outside the scope of the TCO Regulation.
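The notify-then-appeal flow described above can be modelled with two simple records. This is a minimal sketch under assumed, hypothetical field names; it is not a prescribed format:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Complaint:
    """A user's appeal against a removal (illustrative schema)."""
    complaint_id: str
    content_id: str
    user_statement: str
    filed_at: datetime
    decision: Optional[str] = None   # e.g. "reinstated" or "removal upheld"

@dataclass
class RemovalNotice:
    """Notification sent to a user whose content was removed."""
    content_id: str
    user_id: str
    reason: str                      # why the content was removed
    sent_at: datetime
    complaints: list = field(default_factory=list)

    def file_complaint(self, complaint_id: str, statement: str,
                       now: datetime) -> Complaint:
        # Every notified user must have an accessible way to contest the
        # removal; the complaint can be the first step to legal remedy.
        c = Complaint(complaint_id, self.content_id, statement, now)
        self.complaints.append(c)
        return c
```

Linking each complaint to the original notice (and, via the archive, to the preserved content) is what makes the appeals process auditable later.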
6. Publish transparency reports
If an HSP has taken action against terrorist content, whether required to do so after receiving removal orders or proactively, it must publish annual transparency reports. The transparency report on the measures taken within one year in accordance with the TCO Regulation is due on March 1 of the subsequent year. The report must include certain essential metrics and information, such as the number of items containing terrorist content that were removed on the basis of removal orders or other measures. Transparency reports are not only a means of complying with the law but also a way for HSPs to publicly demonstrate their responsibility and engagement.
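If moderation actions are logged consistently throughout the year, the headline metrics for such a report can be aggregated automatically. A minimal sketch, assuming a hypothetical log format (the trigger categories and field names are illustrative, not prescribed by the Regulation):

```python
from collections import Counter

def summarise_actions(action_log):
    """Count moderation actions by year and trigger for an annual report.

    `action_log` is assumed to be a list of dicts such as
    {"trigger": "removal_order", "year": 2023} or
    {"trigger": "proactive", "year": 2023}.
    """
    counts = Counter()
    for entry in action_log:
        counts[(entry["year"], entry["trigger"])] += 1
    return counts

# Example: three logged actions in the reporting year 2023.
log = [
    {"trigger": "removal_order", "year": 2023},
    {"trigger": "proactive", "year": 2023},
    {"trigger": "removal_order", "year": 2023},
]
summary = summarise_actions(log)
```

Keeping the raw log as the single source of truth and deriving the report from it avoids discrepancies between what was done and what is reported.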
In the course of the EU project Tech Against Terrorism Europe (TATE), we have created a guide that is intended to help HSPs and IT professionals meet the requirements of the TCO Regulation. Additionally, we have included many practical tips and hands-on advice for those who want to do more to keep their platforms safe not only from terrorist content but also from other forms of harmful content (e.g., hate speech, incitement to violence). Please access the full, free guide, including an interactive version and multiple language versions (English, German, French), here.
Sophia Rothut is a Research Associate at the Department of Media and Communication at LMU Munich. She researches online radicalisation, mainstreaming of radical ideas, requirements for countering terrorist and extremist online content, and political / far-right influencers.
Heidi Schulze is a Research Associate at the Department of Media and Communication at LMU Munich. In her research, she focuses on radicalization dynamics online and studies radical/extremist (group) communication in alternative social platforms and fringe communities, as well as characteristics and audiences of hyperpartisan news websites.
Diana Rieger is Full Professor at the Department of Media and Communication at LMU Munich. She conducts research on online radicalisation, hate speech as well as the effects of entertainment content, and develops and evaluates countermeasures against radicalisation.
Catherine Bouko is Associate Professor of Communication and French at Ghent University (Belgium). Her main research is on political communication, extremism, and citizenship on social media, with a special focus on image-based communication.
Brigitte Naderer is a Post-Doctoral Research Associate in the Center for Public Health, Department of Social and Preventive Medicine, Suicide Research and Mental Health Promotion Unit at the Medical University of Vienna. Her research focuses on media literacy, online radicalisation, and media effects on children and adolescents.
Image Credit: Freepik