Fanning the Flames: Google/YouTube

Big Tech Platforms: Providing Safe Spaces for Unsafe Actors

Google-owned YouTube plays a key role in exposing young people to white supremacist ideology and anti-Muslim propaganda. In particular, the ‘autoplay’ function of YouTube’s recommendation algorithm “promotes, recommends, and disseminates videos in a manner that appears to constantly up the stakes,” according to sociologist Zeynep Tufekci, who has studied the platform. Searching for the word “refugees” on YouTube, for example, leads users to the channels of prominent anti-Muslim influencers. The mechanism is profit-driven: YouTube is owned by Google, which makes its money off advertising, so the longer users stay on YouTube, the more money Google makes. As Tufekci sums it up: “YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”


GOOGLE

YouTube, which is owned by Google, also plays a key role in spreading far-right ideology. Sociologist Zeynep Tufekci wrote in the New York Times that while she was watching videos of Trump’s campaign rallies in 2016 for an article she was working on, she noticed that “YouTube started to recommend and ‘autoplay’ videos for me that featured white supremacist rants, Holocaust denials, and other disturbing content.” She saw a similar pattern when she started watching videos about other issues: regardless of the subject of the first video, YouTube’s recommendation algorithm “promotes, recommends, and disseminates videos in a manner that appears to constantly up the stakes.”

Tufekci noted that because YouTube is owned by Google and Google makes its money off advertising, “[t]he longer people stay on YouTube, the more money Google makes.” Furthermore, she wrote that YouTube’s recommendation algorithm “seems to have concluded that people are drawn to content that is more extreme than what they started with–or to incendiary content in general.” She concluded, “YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”


This phenomenon has been well documented, and far-right and white supremacist YouTube influencers in particular have figured out how to exploit it. A 2018 report by Data & Society, Alternative Influence: Broadcasting the Reactionary Right on YouTube, shows how a network of 65 political influencers, who together have millions of followers, game YouTube’s algorithms to “build audiences and ‘sell’ them far-right ideology.” In fact, the Daily Beast has called YouTube “a readymade radicalization network for the far right.”

YouTube has been used to spread anti-Muslim bigotry. Researchers noted in a 2018 study that searching for “refugees” on YouTube brought up videos from anti-Muslim YouTube influencers. The day after the Christchurch shootings, the Huffington Post reported that YouTube’s recommendation algorithm was still promoting anti-Muslim videos:

Just hours after the attack, a search for the term “Muslims” from an incognito browser yielded a list of YouTube’s top-recommended videos, including one with 3.7 million views. The video argued, without evidence, that the majority of Muslims are radicalized. From there, YouTube’s autoplay function took over and recommended another round of videos. One apparently exposes “the truth” about Muslims. Another “destroys Islam.”

The Daily Beast interviewed a series of men who say they were led into far-right movements by watching YouTube videos at a young age but later renounced those views. One of the men described how watching a video of Bill Maher and Ben Affleck discussing Islam had brought up recommendations for “a more extreme video about Islam by Infowars employee and conspiracy theorist Paul Joseph Watson.” Once he found Watson’s YouTube channel, he was able to discover other YouTube personalities who shared Watson’s far-right views. He said, “I think YouTube certainly played a role in my shift to the right because through the recommendations I got, it led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”

In June 2019, YouTube unveiled new policies banning white supremacist content and videos that distort or deny events like the Holocaust or the Sandy Hook shooting. Critics pointed out, however, that even before the change YouTube already had anti-hate and anti-harassment policies in place that it had failed to enforce, leading them to question how well it would enforce the new policies. Two months later, BuzzFeed News reported that although YouTube had removed the channel of a prominent white nationalist named Martin Sellner from its platform under the new policy, the corporation reinstated the channel the same week, saying that the decision to remove it had been the “wrong call.” Sellner had links to the Christchurch shooter. According to BuzzFeed News, “Before the massacre, Sellner had repeated contact with him and reportedly sent him a link to his YouTube channel.”