YouTube Doesn't Know Where Its Own Line Is

As one particularly rough week shows, the video platform's content moderation efforts have become more haphazard and inconsistent than ever.

After the mass shooting at Stoneman Douglas High School in Parkland, Florida, in February, far-right conspiracy site InfoWars published a series of videos on YouTube accusing survivor and activist David Hogg of being an actor. In response, YouTube took down several of the videos and reportedly handed the publication at least one “strike” for violating its policies on harassment and bullying. After a YouTube channel receives two strikes within the same three-month period, it cannot post any new videos for two weeks. After three strikes, a channel is banned from the platform entirely.

At first glance, it looked like YouTube had taken meaningful action against one of its most controversial creators. (InfoWars has continued to post videos, indicating it has at most one strike.) But InfoWars has used YouTube to promote lies, hate speech, and false conspiracy theories for years. Videos published by InfoWars insinuating that other school shootings were hoaxes remain on the platform, even as conspiracy videos from other channels are being removed. In essence, no one knows what YouTube's rules even are anymore.

While YouTube has a written policy that ostensibly details what it deems acceptable, its recent posture toward InfoWars and other groups shows that it draws those lines inconsistently in practice. At times, the company seems to police videos primarily in response to public outcry, which makes its decisions inherently haphazard and potentially malleable. It’s a gargantuan task to moderate a site to which hundreds of hours of video are uploaded every minute—but it’s even harder if you don’t have clear, consistent rules about what’s allowed.

One But Not The Other

YouTube's Community Guidelines don't explicitly prohibit creators from uploading videos featuring conspiracy theories or misleading information. But lately that hasn't stopped YouTube from cracking down on them anyway. This week, The Outline reported that YouTube had begun to issue strikes or deactivate the accounts of creators known for spreading hoaxes, as well as several others who post popular gun videos. Those moderation efforts came on the heels of new research showing how widely conspiracy theories get shared on YouTube, and the company's announcement in December that it would expand its moderation team to 10,000 people.

It's not clear what policy YouTube acted on in many of those cases, and later Bloomberg reported that the company said some of the accounts had been mistakenly removed, though it didn't say which ones. On Thursday, YouTube's moderators appeared to have gone off the rails completely, deleting a bunch of air rifle-related videos from the platform for unclear reasons. But while YouTube has pinned the blame so far on overzealous moderators, the issue seems more likely to stem from unclear direction in the first place. If you want to know how confusing YouTube's policies are, just ask the people charged with enforcing them.

And that's all just this week. The platform's creators have complained for years that the company removes videos or excludes them from advertising revenue for unclear or inconsistent reasons. While YouTube's moderation efforts have gotten more haphazard lately, the platform has long faced criticism for failing to adequately explain them.

"The technology to moderate content on platforms like this should be a scalpel, but YouTube's approach always seems to be more like pouring cauldrons of hot tar from a medieval parapet and hoping for the best," says Matt Wallace, a YouTuber with a modest audience.

Meanwhile, back at the InfoWars account, at least seven videos remain up that say the 2012 Sandy Hook Elementary School shooting in Connecticut was a hoax. The tragedy resulted in the deaths of 28 people, 20 of them children aged six and seven. InfoWars’ YouTube videos, with titles like “Bombshell: Sandy Hook Massacre Was a DHS Illusion Says School Safety Expert” and “Why People Think Sandy Hook is a Hoax,” imply that the shooting was staged by the federal government.

The Sandy Hook videos underscore another problem with how YouTube polices content: its strike system. YouTube bans accounts if they receive three strikes within a three-month period. If InfoWars had received, say, two strikes for the Sandy Hook videos published four years ago, it still could have gone on to upload the David Hogg ones this month without a ban. Some leniency makes sense, to allow for honest mistakes. But the current system could also be easily gamed to break the rules continuously, just not too frequently.
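To see why the rolling window is so easy to game, here is a minimal sketch of the strike logic as described above. The 90-day window, the function names, and the penalty labels are assumptions made for illustration, not YouTube's actual implementation.

```python
from datetime import date, timedelta

# Hypothetical model of the strike window described in this article; the
# 90-day figure and the penalties are assumptions, not YouTube's own code.
STRIKE_WINDOW = timedelta(days=90)

def active_strikes(strike_dates, today):
    """Return the strikes that still count, i.e. those inside the rolling window."""
    return [d for d in strike_dates if today - d <= STRIKE_WINDOW]

def channel_status(strike_dates, today):
    n = len(active_strikes(strike_dates, today))
    if n >= 3:
        return "terminated"
    if n == 2:
        return "upload freeze (two weeks)"
    return "active"

# Two strikes in early 2014 age out long before a third arrives in 2018,
# so the channel never nears termination despite repeated violations.
history = [date(2014, 1, 10), date(2014, 2, 1), date(2018, 2, 23)]
print(channel_status(history, today=date(2018, 3, 2)))  # -> "active"
```

Under these assumptions, a channel that spaces its violations a few months apart can keep offending indefinitely without ever accumulating the three concurrent strikes needed for a ban.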

Change of Policy

YouTube also sometimes flip-flops. Again just this week, the company initially defended to The Daily Beast its decision to permit the neo-Nazi group Atomwaffen Division to publish videos on its platform: “We announced last June that we would be taking a tougher stance on videos that are borderline against our policies on hate speech and violent extremism by putting them behind a warning interstitial and removing certain features, such as recommended videos and likes,” a YouTube spokesperson told the publication at the time. “We believe this approach strikes a good balance between allowing free expression and limiting affected videos' ability to be widely promoted on YouTube.”

But when Motherboard brought the hate group’s videos to the attention of the Anti-Defamation League, the organization said they should be deleted "immediately." On Wednesday, YouTube terminated Atomwaffen’s account for hate speech, contradicting what it had told The Daily Beast not two days earlier.

It's troubling that it took such intense public pressure for the platform to decide to remove a group believed to be connected to at least five murders, according to a ProPublica report from last week. Even though Atomwaffen’s account has been terminated, it’s still fairly easy to find other videos on YouTube promoting the group. Moderating a sprawling internet platform consistently takes more than just removing a single account brought to a company's attention by a news report.

In this case, it's a positive that public outrage led to the removal of a hate group's videos. But public pressure can come in all forms and point at any target—it can't be what the platform relies on to make hard decisions.

Inconsistent Action

You don't have to look far for other examples of YouTube's inconsistent moderation. In the wake of the Parkland shooting, the company initially allowed a conspiracy theory video about the massacre to trend on the platform and gain over 150,000 views. In January, YouTube also came under fire, this time for failing to remove a video from Logan Paul—one of its most popular creators—featuring a dead body hanging from a tree. Paul later removed the video himself. And in November, under intense media pressure, the company banned an entire sub-industry of accounts that created exploitative videos of children.

YouTube has taken steps to fix its platform in the wake of each of these incidents, including the most recent. Search for David Hogg on YouTube now and you're mostly presented with legitimate news clips, indicating the platform has adjusted its algorithm to demote conspiracy videos. But the pattern of waiting until something is brought to YouTube's attention before doing anything meaningful will likely continue. YouTube did not respond to multiple requests for comment.

Reporters covering technology companies are stuck in a cycle, as others have noted, in which part of the job involves doing a platform’s content moderation work for it. For over a year, it has felt like half of tech coverage consists of journalists merely pointing out that sites like YouTube, Facebook, or Twitter are still hosting hate groups, revenge porn, and people inciting violence.

There are no easy answers for how to move past it, but greater transparency would be a good start. If YouTube showed users exactly how it moderates videos, we’d at least have clearer answers about where the line is. Right now, it looks like YouTube itself doesn't even know.
