Politics

YouTube is leaving its creators in the dark

The site is demonetizing videos en masse, spreading a sense of persecution in communities from LGBT creators to the alt-right.


The automated system that screens YouTube videos for things advertisers want to stay away from has been on a tear lately, stripping popular creators of their revenue streams and causing confusion, chaos, and the paranoid thought: Why is YouTube targeting me?

“Welp it's begun. @youtube finally started demonetizing any and all LGBTQIA content on our channel,” tweeted Gaby Dunn, a queer comedian who runs the channel Gaby & Allison, which frequently discusses queer sex and relationships. Even videos that were not sexual at all were flagged as “inappropriate for advertisers” and stripped of ads and revenue, she said, a pattern she sees as evidence of an anti-queer bias.

Meanwhile, conservative and alt-right vloggers suspected YouTube was censoring them for their political views. “YouTube’s ‘Diamond and Silk’ Believe They Were Demonetized for Trump Support,” read one Breitbart headline. “I think YouTube is furious that so many conservative channels have gotten so popular in the last year, and they don’t want us to be able to work full-time doing what we’re doing because our message is at odds with almost everything that Google and YouTube’s leadership stands for,” vlogger Mark Dice told The Daily Caller. “YOUTUBE DECLARES WAR ON POLITICALLY INCORRECT OPINIONS,” wrote Infowars.

Many of YouTube’s top creators took it especially personally, posting videos about how YouTube had grown on the strength of its popular vloggers only to cast them aside. “By taking away monetization, it is a form of censorship,” vlogger Philip DeFranco said.

YouTube has been demonetizing videos in baffling ways for a long time, but a new notification system is making creators fully aware when it happens. LGBT channels that discuss relationships and sexuality are seeing queer content demonetized and straight content left alone; video game channels are finding flags on videos of animated card games but not on violent shooter videos; and evangelical Christians are finding their entire channels suddenly de-funded.

Some YouTubers have even reported that YouTube appears to be penalizing their videos based on negative comments rather than on the videos themselves. That makes a certain amount of sense, since comments can state explicitly what a video only implies or dog-whistles, but it is also richly ironic: YouTube has never bothered to address its notoriously toxic and hurtful comment sections.

But what feels like political oppression to all of these communities is in fact something much more mundane: YouTube is penalizing everyone.

The tension between YouTube’s loyalty to advertisers and its loyalty to creators has been percolating for years. YouTube runs ads before videos according to a labyrinthine auction system that more or less separates advertisers from the content they’re sponsoring. Instead of buying ad space on a specific program as they would have in the TV era, YouTube advertisers target viewers directly: whatever video a white male between the ages of 18 and 35 chooses to click on, a video game company wants to show him its ad. Unfortunately, that viewer might be clicking on an ISIS propaganda video, and large corporate advertisers don’t want their ads anywhere near content like that.

When users see flags affecting themselves and their favorite channels, the internet’s persecution complex takes over.

In March, YouTube tried to please advertisers with an update to Restricted Mode, an optional setting that has been around in some form since 2010. When parents or schools turn on Restricted Mode, videos flagged as inappropriate become unavailable to those users, lowering traffic numbers and cutting creators off from potential income. Unfortunately, YouTube’s ham-fisted filtering mechanism labeled LGBT content as restricted, even flagging a video of two women exchanging wedding vows. YouTube immediately walked back the filter to re-tool and try again.

The newest approach began on August 7 with a new tool that lets creators appeal decisions made by the automated system. By August 8, the first reports of bizarre flagging had surfaced. One YouTuber wrote on Reddit that a video of a swarm of dragonflies, with no talking at all, was flagged for demonetization; one anonymous vlogger said all of their channel’s videos were flagged at once, but 90 percent were restored after appealing. These reports are anecdotal, but an automated system whose flags are overturned at anything close to that rate is clearly not ready to roll out.

Still, the roll-out has been steady, with more creators speaking up about demonetized channels through the end of August and into early September. Already burned by the Restricted Mode episode in March, the LGBT community has been outspoken about the changes. But other communities are feeling attacked as well. YouTube’s filter is on the lookout for “hateful content,” so conservative religious channels with videos arguing against opposing religious viewpoints, or even videos explaining the spread of ISIS and militant religious extremism, have been demonetized too. A conservative “mommy blogger” and a radio host both lost their YouTube revenue streams, prompting the Christian Post to ask, “Is YouTube Targeting Christians and Conservatives?”

Arielle Jane, a YouTuber from the Midwest, uses her account as a video diary and as a platform to talk about the things that are important to her. Her channel is small, with 1,100 subscribers, and she is proud to be part of the so-called “smalltuber” community (“YouTube is currently just providing a bit of extra pocket change,” she says), but her audience is growing fast.

In August, two videos she tagged with LGBT+ categories were automatically flagged by YouTube’s content control filter. “I did a video answering questions for the LGBT+ Smalltuber Tag,” she says. “It was arguably the most innocent, wholesome video I've ever uploaded, talking about my experiences with being LGBT+ and offering advice to fellow members of the community on their own coming out journey.” Both that video and a follow-up were flagged instantly, which convinced Arielle that her choice of LGBT-focused category tags caused the problem.

Arielle filed an appeal, but hit a new snag: YouTube only reviews appeals for videos that get more than 1,000 views over a seven-day period, or for channels with more than 10,000 subscribers. The income at stake for videos this small is negligible, but the message to creators with small audiences is clear: YouTube makes mistakes, and small videos aren’t worth the company’s time to get right. According to that policy, unless her videos got a serious traffic bump, her appeals would simply never be reviewed. No one from YouTube ever responded to her, but after a couple of weeks the company corrected its error and restored both videos. She was relieved, but confused: it seemed as though someone at YouTube had granted her appeal, despite written rules that said it would never even be seen.

YouTube’s opaque systems have created an information vacuum, and tribalism and echo chambers have rushed to fill it. The intent of the demonetization algorithm is clear, but the algorithm is badly flawed. When users see flags affecting themselves and their favorite channels, the internet’s persecution complex takes over. It’s not hard to find people loudly convinced that YouTube hates them. It’s so obvious, they say, that YouTube wants to silence mainstream Republicans, politically incorrect commentators, transgender people, conspiracy theorists, feminists, minorities, queer people in general, horror filmmakers, journalists, and America itself.

The technology we’ve come to rely on, whether it’s the software on our phones or the social media platforms we use online, is a black box. The code is usually proprietary, and even if we could see it, most of us couldn’t understand it. We input our lives, mysterious forces act on them, and the only output we can see is a reflection of our own paranoia and fear. It’s the same dynamic that has led to a gut-level rejection of “fake news,” a fracturing of reality into smaller and smaller echo chambers governed by their own alternative facts. The only way to solve a problem like a black box is transparency, something massive tech companies have historically been unwilling to offer.

YouTube did not respond to a request for comment.
