Pressuring Platforms to Censor Content is the Wrong Approach to Combating Terrorism

By Scott Craig and Emma Llansó

The UK government has published a new counter-extremism strategy outlining the steps it intends to take to counter extremist ideologies in British society. In an expansion of earlier efforts designed to prevent people from being “drawn into terrorism,” the government now intends to actively challenge non-violent extremism both online and offline. Its core aim is to ensure that ‘extremist propaganda’ does not go uncontested.

The measures, which include financial support for organizations that produce effective counter-narratives, the introduction of new Extremism Disruption Orders (EDOs), and a new “cohesive communities” programme, have been met with mixed reactions.

Quilliam, the leading counter-extremism think tank in the UK, broadly supported the measures but cautioned that banning or blocking content was not an effective way to tackle extremism. In other quarters, an unlikely alliance of the Christian Institute and the National Secular Society has formed to campaign against the strategy, which they say will have a chilling effect on free speech if implemented. CDT agrees: the Strategy’s proposals that focus on policing online speech and removing ‘propaganda’ from the Internet are deeply troubling.

Privatizing government censorship

Several years ago, the UK created the Counter-Terrorism Internet Referral Unit (CTIRU), a unit within the police force that seeks to remove ‘extremist’ content from the web by, among other methods, using websites’ content-flagging mechanisms to report content for violating a site’s terms of service (TOS). In July, the Counter-Terrorism Coordinator for the European Union launched an EU-wide version of this program.

The CTIRU has secured the removal of 4,000 pieces of content a month in 2015, “taking the total to 110,000 pieces of propaganda removed” since 2010. But one of the biggest concerns this program raises, from the perspective of freedom of expression, is that it is not at all clear what the government considers “propaganda.”

The UK’s broad definition of “extremism,” which is set down not in statute but in the much-debated PREVENT Strategy, is “vocal or active opposition to fundamental British values, including democracy, the rule of law, individual liberty and mutual respect and tolerance of different faiths and beliefs.” This definition has been repeatedly challenged by Members of Parliament for being too broad, which increases the risk of abuse. Further, this definition, and the Counter-Extremism Strategy’s focus on “violent and non-violent extremism”, goes well beyond the limits of what sort of speech governments may permissibly restrict under international human rights law, raising serious concerns about just what the government is seeking to censor through this program.

As part of its proposals, the government has now called on companies “to strengthen their terms and conditions” in order to “ensure fewer pieces of extremist material appear online.” However, when Internet companies create their TOS, they consider a wide variety of factors that go well beyond what is impermissible under the law. Operators of websites and social networks develop policies based on the type of user base and community they are seeking to attract and the subject matter of the site. A blog devoted to baking might prohibit any discussion of politics or current events in its TOS to keep commenters on topic – a reasonable response for a website operator, but something the government could never do.

In seeking content removal through TOS enforcement, the government is relying on companies’ definitions of terms such as threats, violent or graphic content, malicious speech, and ‘dangerous organizations’, rather than standards set forth in law. Even the most clearly articulated or narrowly drawn of these content policies will tend to go well beyond what government may permissibly censor, and they exist as privately developed codes of conduct, not publicly promulgated regulations that can be challenged in court. Further, a government official seeking to restrict content through a company’s TOS is seeking removal of that content worldwide, exceeding the scope of that official’s jurisdiction.

Although we recognize the difficulty that society as a whole faces in tackling the spread of extremism, CDT has a number of concerns with this proposal:

  • Policing of speech online, if undertaken at all, should be grounded in legal frameworks rooted in international human rights, not governed by corporate Terms of Service.
  • Attempting to censor content that does not contravene the law is incompatible with the strategy’s goal of upholding the rule of law. It also sends a negative signal to other countries with weaker human rights records that are considering similar measures.
  • Pressuring private companies to change their terms of service is an improper interference with agreements between private parties. The limits of free speech in the UK must be clearly articulated in law and reviewable by independent courts, not created ad-hoc by private companies under pressure from the government.
  • The goal of “contesting the online space” does not require overbroad efforts to block or remove content. Silencing radical views will not lessen their appeal – and may even increase it. Stifling dissent is no way to challenge extreme ideologies; it only pushes conflict out of sight, and deprives the broader populace of an awareness of the views and tensions that exist within their communities.

There is no doubt that the government is facing challenges from extremist groups, and that preserving democracy, the rule of law, and individual liberty ranks among the most serious duties and most difficult challenges any government faces. Governments all over the world are grappling with the same problem.

Pressuring social media companies and Internet Service Providers to censor content that does not contravene the law not only circumvents Parliament’s authority to determine permissible restrictions on free speech in the UK, but also deprives individuals of their rights to an accountable determination of illegality and to seek redress from an independent arbiter. This complete circumvention of due process sends a negative signal about the government’s commitment to freedom of expression and the rule of law.

The UK is a successful multi-racial, multi-faith democracy that encourages mutual respect and tolerance of different ideas and beliefs, but that position cannot be sustained without outlets for non-violent dissent. It is our belief that the way to promote democratic values is through openness and debate, not through bans and extra-legal censorship. Ultimately, a democratic society’s support for freedom of expression is its most potent argument.

CDT recommends that:

  • The limits of lawful speech should be governed by international human rights standards, not commercial terms of service. Restrictions on speech must be proven necessary by the state to achieve a legitimate aim.
  • Governments seeking removal of unlawful content must do so by procedures that are provided by law, and that ensure accountability and an opportunity for appeal. Such restrictions must be the least restrictive and proportionate means to achieve the government’s aim.
  • It should not be the executive’s prerogative to define or alter the definition of “extremism” without the approval of Parliament. The definition should, like the definition of “terrorism”, be prescribed in law.
  • Governments should provide transparency about their policies and procedures for removing online content, including information about the nature and scope of the content removed. Company transparency reporting can also help to provide more information about content removal requests from governments, though it may not always be apparent to companies when a removal request originates with a government. This underscores the need for transparency from governments directly.

Scott Craig is the Ford-Mozilla Technical Exchange fellow at the Center for Democracy & Technology. He is also the founder of civctech.co.uk & electionalerts.co.uk and a hobbyist front-end developer with a keen interest in web technologies.

Emma Llansó is the Director of CDT’s Free Expression Project, which works to promote law and policy that support users’ free expression rights in the United States and around the world.

This post was first published on the Center for Democracy & Technology website on 5 November 2015. Republished here with the authors’ permission.
