In June 2019, YouTube unveiled a revamped policy that made hate speech a violation of its terms of service. The company then set about enforcing that policy by immediately purging a wave of videos that violated the new, more robust rules.

The targeted videos promoted white supremacy, chauvinism, Holocaust denial, and other hateful ideologies. The crackdown was meant to chip away at the pervasive presence of “extremist influencers” who use YouTube’s platform, in particular, to turn young viewers into new converts and normalize white nationalism.

Unintended consequences

This campaign, however, misfired and removed far more content than the Silicon Valley giant intended. YouTube inadvertently rooted out thousands of videos uploaded by scholars and civil rights organizations to educate and warn viewers about hateful ideologies, their histories, and the dangers they pose today.

“YouTube is certainly not leading the pack in taking a stand against hate,” said Keegan Hankes, the interim research director of the Intelligence Project at the Southern Poverty Law Center. “But anytime you have a major technology company outwardly banning neo-Nazism, white supremacy, other bigoted ideologies, and actually making an effort to go out and enforce it, that’s a good thing.”

One of the Center’s own videos, an interview with an outspoken Holocaust denier, was among those erroneously swept up in YouTube’s purge.

YouTube’s inability to target only hateful videos illustrates a larger point about the existential struggle facing social media companies today: how to disentangle their platforms from the extremists who have so firmly established themselves in these interconnected yet dispersed communities.

Counterprotesters at the right-wing “Unite the Right 2” rally, held in Washington, D.C., on Aug. 12, 2018, the one-year anniversary of the deadly far-right protests in Charlottesville. (Photo by Yasin Ozturk/Anadolu Agency/Getty Images).

A semi-sudden change in direction

With nearly two billion visitors per month, YouTube has some of the highest traffic of any website on the Internet. In the years that followed its launch in 2005, YouTube’s moderators freely removed questionable or offensive videos from the site as often as necessary and with little pushback.

The company soon adopted a far more laissez-faire attitude, however. Like other Silicon Valley companies, YouTube vacillated over whether its service should facilitate unfettered free speech or vigilantly police content to ensure users’ relative safety and comfort.

Unfettered free speech won out. Virtually any message became permissible so long as it did not directly advocate for real-world violence against any person or group.

Extremists flocked to YouTube to take advantage of this environment. They uploaded content that pushed racially charged and xenophobic conspiracy theories, as well as videos that outright advocated white supremacy.

The company’s infamous recommendation algorithm helped fuel the spread of these ideas. As people watch more and more videos on YouTube, the algorithm tries to surface content that will keep them engaged. In practice, this can lead down the proverbial “rabbit hole” toward ever more provocative and extreme videos. This dynamic was generally regarded as a nuisance until a few years ago, when ideas from the darker side of that rabbit hole began encroaching on current events.

Within the last few years, violence changed our collective perception of social media (including YouTube) as a benign, or at least ambivalent, influence on culture. Dylann Roof murdered nine people at Emanuel African Methodist Episcopal Church in Charleston after a Google search led him to white supremacist propaganda, and the Christchurch massacre was deliberately live-streamed on Facebook; together, these events illustrate the strength of extremist communities operating online.

“The discussion has changed a lot in the last year or two,” said Hankes. “Companies like YouTube have had prohibitions against hate in their policies, they were just almost never enforced. In some ways, they’re taking it a lot more seriously. But, when it comes to enforcing these rules consistently, it’s a very difficult task. That requires, basically, constant maintenance and work. We aren’t there yet.”

YouTube reacts to changing times

To act more proactively against extremism, YouTube began considering a strengthened anti-hate policy in 2018. The result is now available online in the Community Guidelines, which begin: “Hate speech is not allowed on YouTube.”

The groups protected by this policy are wide-ranging. Users are in violation if they make statements that demean vulnerable groups (e.g., women, people of color, victims of violent crime) or statements that flagrantly deny that well-documented violent events, from school shootings to the Holocaust, ever occurred.

The rule was still in the works in 2019 when an unexpected row between two YouTubers changed the company’s calculus. In May, conservative influencer Steven Crowder unleashed a torrent of videos harassing and attacking Carlos Maza, a video essayist with Vox. Crowder’s criticism quickly turned unnecessarily cruel as he described Maza as, among other epithets, a “lispy queer.”

Maza used his social media channels to pressure YouTube to respond. The platform did, after several long weeks of handwringing, by demonetizing Crowder’s channel and by rolling out the updated Community Guidelines described above.

Susan Wojcicki, CEO of YouTube, participates in the CEE Innovator Summit on March 28, 2017, in Warsaw, Poland. (Photo by Karol Serewis/Gallo Images Poland/Getty Images).

The subsequent decision to enforce those guidelines by deleting extremist content has been welcomed by civil rights advocates, but it hasn’t spared the company a heaping dose of criticism.

“What’s disturbing is that it seems like YouTube, just like so many other social media giants, isn’t learning from these kinds of predictable mistakes,” said Hankes. “I mean, you would hope that they would spend considerable time thinking about what the implications could be for how they implement these policies.”

The difficult path toward a hate-free platform

While YouTube has made efforts to change, much remains the same. Promoters of hateful ideologies still have videos on the platform, and they will continue to flock there to spread their message for the foreseeable future.

YouTube, furthermore, may be showing a greater will to combat hateful entities on its website, but its tools for doing so are basically the same as before. Finding this content still relies on algorithms on the one hand and humans on the other: users who flag troublesome content and workers who determine whether it violates the terms of service.

Both those methods failed the SPLC, Public Broadcasting Service, University of California, and scores of other groups whose educational videos became collateral damage in the hunt for extremism.

The other component in all this is how entrenched users’ online behaviors have become. Research shows that people often seek out political content that confirms their existing points of view, and outreach meant to change their minds can instead reinforce those beliefs. As the number of hate groups in America rises year after year, permanently interrupting the communication between existing extremists and potential converts becomes vital to truly disrupting the spread of organized hate.

“It’s all so complex. These companies do not benefit from having extremism on their platform,” Hankes said. “Obviously, many of them benefit from controversy or ‘viralness’ and the clicks and whatnot, but when it comes to outright white supremacist content, it’s not good for the company.”

As for the SPLC and other civil rights champions, YouTube did its best to make things right. It brought the SPLC’s video back online after a few days, alongside much of the other educational content that had been deleted. The online juggernaut has yet to announce its next steps, but the new Community Guidelines seem here to stay.

“This summer, YouTube made some strides toward trying to tighten up their policies and their prohibitions on extremist content. But, when you’re talking about this large of a company with so much video uploaded all the time, there’s a lot of work to do,” Hankes concluded.