By Suraj Lakhani
There has been, especially of late, increasing concern over the misuse of video-games and associated (adjacent) platforms (e.g. Steam, Discord, Twitch, DLive) for the purposes of extremism and violent extremism. Across the European Union (and for that matter globally), policymakers, law enforcement, academics, and counter-extremism practitioners have started to engage more on the topic. In a recent Radicalisation Awareness Network paper, it was suggested that extremists and terrorists, who are often pioneers in the digital space, are afforded new opportunities through gaming and associated (adjacent) platforms. These individuals ‘have introduced innovations faster than we have been able to respond, and as a result, have grown their digital advantage’. Most notably in Europe, there has been particular concern over the digital recruitment tactics of far-right (violent) extremists.
There are fears that online video-games and associated (adjacent) platforms can be used to disseminate digital propaganda, and for purposes of radicalisation and recruitment. However, the relationship between radicalisation, recruitment, and gaming is often complicated. Current literature questions whether these outcomes are (violent) extremists’ primary intentions; instead, reinforcing beliefs, building and strengthening communities, and developing a more robust online ecosystem appear to hold greater prominence. This blog is based on a recent paper undertaken by the author for the European Commission’s Radicalisation Awareness Network Policy Support (currently in press).
Radicalisation, Recruitment, and Reinforcing Views
The question of whether video-gaming or associated (adjacent) platforms are deliberately used for purposes of radicalisation and recruitment is predictably complicated and often contradictory in nature. Emerging research argues there is limited evidence that radicalisation and recruitment were part of a concerted strategy by (violent) extremists on associated (adjacent) platforms; rather, these spaces were seen to serve wider functions and act as part of ‘the broader social activity of individual users’. This points to a consideration that is rarely mentioned in relation to violent extremism and video-gaming: as well as some violent extremists and their organisations purposefully targeting video-gaming spaces, numerous others who hold similar ideologies and beliefs will be gamers themselves, using video-gaming in the same way many others do, i.e. to have fun, socialise, and develop communities.
This does not mean radicalisation and recruitment are not present, or not a purposeful strategy for some (violent) extremists on these platforms. In fact, a recent paper from the European Union Counter-Terrorism Coordinator argues that online gaming platforms could, in future, replace traditional social media platforms as the preferred mode for propaganda dissemination and recruitment. The argument here is intended to highlight a number of wider considerations, two in particular. First, empirical research in this area is extremely limited, making it currently difficult to determine the scale of radicalisation and recruitment on these platforms. This could be particularly relevant for in-game voice and text chat. Second, we need to consider the wider functions of these platforms, including – as discussed in relation to bespoke and modified games – reinforcing and normalising the beliefs and motivations of those who are already sympathetic or attuned to the organisations’ messages, or already part of a movement.
Community Building and Strengthening
Another function of online gaming and associated (adjacent) platforms can be community building and strengthening around mutual interests (including gaming and extremism-related narratives). Although recent research has demonstrated similar findings, what is particularly concerning is evidence of the interlinking of non-violent and violent forms of far-right extremism on these platforms. This has also been demonstrated in the wider (non-gaming) literature, which points to the overlap and collaboration (sometimes international) between previously distinct far-right extremist communities.
In some instances, within recent work, it was determined that various communities on these platforms attempted to verify users’ extremist affiliations (ideologically or otherwise) before they were permitted to join their servers/online spaces. This, once again, demonstrates the complexity of the issue in terms of radicalisation and recruitment. If groups attempt to ‘verify’ users before admitting them, this suggests that the radicalisation and recruitment functions are minimised, as members already identify to some degree with the ideologies in question. Other communities were open, which could attract a wider range of users; though, once again, this does not automatically mean that radicalisation and recruitment are the primary functions, nor should the possibility of this occurring be overlooked.
Extremist Online Ecosystem
Finally, it is important to note that these video-game hosting and complementary (adjacent) platforms do not operate in silos, but are used as part of a wider online ecosystem by (violent) extremists who, in similar ways to the general population, use a plethora of online spaces for different purposes, to reach wider audiences, and to ensure content and accounts remain active during attempts at disruption. As a result, (violent) extremists adopt flexible, multi-platform approaches, where they can use and shift across several online spaces in order to maintain their presence. There are also widespread concerns that video-gaming can be an entry point where, once trust is established, recruiters may be able to guide people to alternative, less monitored, spaces via communication tools embedded in video-games.
Generally, the issues outlined within this blog have been experienced by traditional social media platforms for many years. Although there is still much work to do, this has led to various actions across the industry, including collaboration to exchange knowledge, experience, and good practice, either informally or through initiatives like the Global Internet Forum to Counter Terrorism (GIFCT). Further, GIFCT has provided an opportunity for these social media platforms to share hash-type databases, which can help to identify (violent) extremist content that has already been, or attempted to have been, uploaded to other online spaces. In addition, through initiatives like Tech Against Terrorism, smaller and emerging platforms have been supported, providing opportunities to co-produce solutions to some of the challenging problems faced by these companies. It is suggested that video-game producers and associated (adjacent) platforms consider similar approaches to collaboration across the industry and wider ecosystem.
Dr Suraj Lakhani is a Senior Lecturer within the School of Law, Politics & Sociology at the University of Sussex. Suraj engages in research on terrorism, (violent) extremism, online extremism/terrorism, and counter-terrorism policy. He is an Associate Fellow at the Royal United Services Institute, and a Research Fellow at VOX-Pol. On Twitter @surajlakhani.
Image credit: Pexels.