The Future

InfoWars is the hill that tech companies are choosing to die on

How is Alex Jones still on YouTube and Facebook?

Despite the fact that its sole focus is the spewing of conspiratorial nonsense topped with a thick layer of unbridled rage and supplement sponcon, Alex Jones’ InfoWars is still kicking, thanks to an inane YouTube policy. In February, the channel received a strike for violating YouTube’s policy on “harassment and bullying” by posting a video alleging the Parkland shooting survivors were crisis actors. A few days later, it was issued a second strike in response to another video about Parkland, and was temporarily locked. On Tuesday, it received its third strike, this time in response to four separate videos that were in clear violation of YouTube’s policies on hate speech and child endangerment. Yet the channel wasn’t taken down.

This, of course, stands in stark contrast to the bold stances against hate speech and general depravity taken by both Facebook and YouTube top brass over the years. But therein lies the answer: it’s a lot easier to talk about having strong moral standards than it is to actually enforce them.

The basic unit of YouTube’s content moderation strategy is the “warning strike,” issued when a channel commits some cardinal sin like posting extremely graphic content, inciting violence, threatening someone, or infringing copyright. Channels lose privileges as they gain strikes: one strike removes the offending video and deactivates some features, like the ability to live stream; two bans the content creator from uploading videos for two weeks; and three is YouTube’s version of being “outta here,” i.e. account termination.

The catch is that all of this has to happen within a three-month period, which includes the two weeks a channel would be frozen after a second strike. If multiple videos are found to be in violation of the rules all at once, they’re rolled into a single strike. Strikes expire after 90 days, because “We” — meaning, YouTube — “understand that users make mistakes and don’t intend to violate our policies — that’s why strikes don’t last forever.”
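The arithmetic here is easy to miss, so a rough sketch may help. Modeled in Python, the policy as described works out to something like the following; the 90-day expiry and the videos-roll-into-one-strike rule come from YouTube’s stated policy, but the `Channel` class, method names, and exact dates are illustrative assumptions, not YouTube’s actual systems.

```python
from datetime import date, timedelta

STRIKE_LIFETIME = timedelta(days=90)  # strikes fall off after 90 days

class Channel:
    def __init__(self):
        self.active_strikes = []   # dates of strikes still on the record
        self.terminated = False

    def review(self, today, violating_videos=1):
        """One moderation pass. Any number of videos flagged at once
        collapses into a single strike -- violating_videos never matters,
        which is exactly the point."""
        # Strikes older than 90 days drop off the record entirely.
        self.active_strikes = [d for d in self.active_strikes
                               if today - d < STRIKE_LIFETIME]
        self.active_strikes.append(today)  # one strike, however many videos
        if len(self.active_strikes) >= 3:
            self.terminated = True

# Approximate InfoWars timeline: two strikes in February, a third in
# late July covering four videos at once (dates are illustrative).
infowars = Channel()
infowars.review(date(2018, 2, 21))                      # first Parkland video
infowars.review(date(2018, 2, 27))                      # second Parkland video
infowars.review(date(2018, 7, 24), violating_videos=4)  # four videos, one strike
print(len(infowars.active_strikes))  # 1 -- the February strikes have expired
print(infowars.terminated)           # False: the channel survives
```

Run the timeline through and the February strikes have already aged off the record by the time the third one lands, so a channel with six flagged videos walks away carrying a single active strike.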

To make matters worse, this inane 90-day auto-expiration policy only applies to “Community Guidelines strikes” (which are issued for videos that contain hate speech, violence, threats, or some other form of harmful content), not copyright strikes, which can often hit smaller creators in a flurry. To get a copyright strike expunged from their channel’s record, users have to complete “Copyright School.” That YouTube has such a system in place for people who use “Wala Cam” footage without permission, but offers not even an attempt at course correction for racists, child abusers, and hate-mongers, shows where the company’s priorities really lie.

Multiple violations shouldn’t be discounted and rolled into one strike solely because it took a YouTube mod a couple of weeks to look critically at the wealth of disturbing content piling up on one account. Once a ‘mistake’ like this has been made three times in a row, it ceases to be a mistake, especially when it’s ‘harassing children on camera to get more views on YouTube’ or ‘calling the survivors of a mass shooting crisis actors.’

YouTube’s policy has been inconsistently applied, but it has still worked somewhat better than Facebook’s nonexistent one. On Wednesday, Facebook ruled that Alex Jones’ disturbing rant against Robert Mueller — during which he accused the special counsel of pedophilia and child rape and pantomimed shooting him in the head — adhered to Facebook’s community guidelines. During last week’s House Judiciary Committee hearing on social media filtering practices, Facebook’s Monika Bickert confirmed that the company doesn’t have a strong content stance, as did Mark Zuckerberg during his now-notorious interview with Recode’s Kara Swisher:

It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.”

What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed…

The tech companies have so far faced few repercussions for failing to regulate their platforms over the last decade-plus, which is why they’ve done so little. But as this unrest starts to impact their bottom line, their gestures at cleaning up may not be enough.