Meet BreadTube, the YouTube Activists Trying to Beat the Far-right at Their Own Game

By Alexander Mitchell Lee

YouTube has gained a reputation for facilitating far-right radicalisation and spreading antisocial ideas.

However, in an interesting twist, the same subversive, comedic, satiric and ironic tactics used by far-right internet figures are now being countered by a group of leftwing YouTubers known as “BreadTube”.

By making videos on the same topics as the far-right, BreadTubers essentially hijack YouTube’s algorithm, getting their videos recommended to viewers who consume far-right content. BreadTubers want to pop YouTube’s political bubbles to create space for deradicalisation.

Pivot to the (political) left

The name “BreadTube” has its origins in the anarchist Peter Kropotkin’s book The Conquest of Bread. The name emerged organically as a more comedic alternative to “LeftTube”, and captures the dissident leftwing nature of the creators it encompasses.

The movement has no clear origin, but many BreadTube channels started in opposition to “anti-SJW” (social justice warrior) content that gained traction in the mid-2010s.

The main figures associated with BreadTube are Natalie Wynn, creator of ContraPoints; Abigail Thorn, creator of Philosophy Tube; Harris Brewis, creator of Hbomberguy; and Lindsay Ellis, creator of a channel named after herself. Originally the label was imposed on these creators, and while they all identify with it to varying degrees, there remains a vibrant debate as to who is part of the movement.

YouTuber Natalie Wynn’s ContraPoints is among the leading channels for BreadTube content.

BreadTubers are united only by a shared interest in combating the far-right online and a willingness to engage with challenging social and political issues. These creators infuse politics with their other interests, such as film, video games, popular culture, history and philosophy.

The current most popular BreadTuber, Wynn, has described her channel as a “long theatrical response to fascism” — and a part of “the left’s immune system”. In an interview with the New Yorker, Wynn said she wants to create better propaganda than the far-right, with the aim of winning people over rather than just criticising.

Euphemisms, memes and “inside” internet language are also used in a way that traditional media struggle to replicate. The Southern Poverty Law Center has referenced BreadTubers to help unpack how memes spread among far-right groups, and the difficulty in identifying the line between “trolling” and genuine use of far-right symbols.

BreadTubers use the same titles, descriptions and tags as far-right YouTube personalities, so their content is recommended to the same viewers. In their recent journal article on BreadTube, researchers Dmitry Kuznetsov and Milan Ismangil summed up the strategy thus:

The first layer involves use of search algorithms by BreadTubers to disseminate their videos. The second layer – a kind of affective hijacking – revolves around using a variety of theatrical and didactical styles to convey leftist thought.

What are the results?

The success of BreadTubers has been hard to quantify, although they seem to be gaining significant traction. They receive tens of millions of views a month and have been increasingly referenced in media and academia as a case study in deradicalisation.

For example, The New York Times has reported in depth on individuals’ journeys from the far-right towards deradicalisation via BreadTube. Further, the r/Breadtube subreddit and the comment sections of BreadTube videos are littered with users describing how they broke away from the far-right.

These anecdotal journeys, while individually unremarkable, collectively suggest the movement is having an impact.

YouTube’s algorithms are a problem

The claim that YouTube helps promote far-right content is both widely accepted and contested.

The central problem in assessing this claim is that YouTube’s recommendation algorithm is secret. What is clear is that YouTube’s fixation on maximising watch time means users are recommended content designed to keep them hooked.

Critics say YouTube has historically had a tendency to recommend increasingly extreme content to the site’s rightwing users. Until recently, mainstream conservatives had a limited presence on YouTube and thus the extreme right was over-represented in rightwing political and social commentary.

At its worst, the YouTube algorithm can allegedly create a personalised radicalisation bubble, recommending only far-right content and even introducing the viewer to content that pushes them further in that direction.

YouTube is aware of these concerns and does tinker with its algorithm. But how effectively it does this has been questioned.

Limitations

Ultimately, BreadTubers identify and discuss, but don’t have the answer to, many of the structural causes of alienation that may be driving far-right recruitment.

Economic inequality, lack of existential purpose, distrust in modern media and frustration at politicians are just some of the problems that may have a part to play.

Still, BreadTube may yet be one piece of the puzzle in addressing the problem of far-right content online. Having popular voices that are tuned into internet culture — and which aim to respond to extremist content in the same tone of voice — could be invaluable in turning the tide of far-right radicalisation.


Alexander Mitchell Lee is a PhD candidate at the National Security College, studying the relationship between Australia’s conservative parties, the Liberal and Country parties, and Croatian nationalists during the Cold War. Alex has previously researched Australia’s relationship with Rhodesia and Southern Africa in the 1970s. On Twitter @alexmitchelllee.

This article was originally published on The Conversation, republished here under a Creative Commons license. Photo by NordWood Themes licensed under CC BY 2.0.
