Yes, something is wrong on the internet. The way we communicate information of consequence on the social web is hopelessly, terribly broken, and the gulf of understanding is only widening as Facebook, Google, and Twitter fight for more scale, more money, more attention, and more data. Their business is not truth, their business is time — and our minds are the collateral damage of this battle.
There is a path out of this mess, but it has to begin with the largest technology companies in the world accepting that their algorithms really don't understand the value of information. More importantly, it means changing the way we think and learn about what information is, how we process it, and how we decide to spread it to other people.
On Wednesday, countless stories were written about the shocking algorithmic mistake that placed conspiracy theory videos smearing the child victims of the Parkland school shooting as “crisis actors” at the top of YouTube’s recommendations. The theory has been making its way out of the fetid sewer of 4chan and Mike Cernovich-adjacent fake news factories for several days now. Finally, thanks to the engines of commerce provided by mainstream tech, it is breaking through to front pages around the internet.
The current top video on YouTube’s trending page is peddling a Parkland school shooting conspiracy theory https://t.co/LsUoYRqvHp— Select All (@selectall) February 21, 2018
Last night, NBC’s head of social media Micah Grimes pointed out the same trend on Facebook, where multiple versions of the crisis actor stories circulated via organic, human sharing to hundreds of thousands of people. Those stories peddled the same blatant lies, misinformation, and misdirection about the victims of a violent massacre.
This is how absurd, gaslighting "crisis actor" theories go viral.— Micah Grimes (@MicahGrimes) February 20, 2018
One @facebook post from this person has 111,000+ shares. Another has 23,000.
This is one person, two posts.
Imagine the millions and millions of people crackpot theories like this are reaching and influencing. pic.twitter.com/VU7cKCJhXq
We are in the midst of an information crisis, and no amount of algorithmic tweaks, purges of bots, or surveys of users is going to fix this problem. Social media in its current state is built on the premise that popularity — virality — is the core signal of value in its marketplace. When stories travel fast and are praised widely, the algorithm blesses them and serves them up to more and more users. Virus is the right word for this behavior — a sickness that spreads indiscriminately and quickly, with little regard for the health of its host. In this case, the host is our sense of truth, our faith in facts, and our shared narrative about what is happening in the world. The host is very ill. As much as Facebook’s vice president of ads would like you to believe this is just about a few bad apples who are paying to play, it’s actually about the foundation of how the company’s business functions and thrives.
Most of the coverage of Russian meddling involves their attempt to effect the outcome of the 2016 US election. I have seen all of the Russian ads and I can say very definitively that swaying the election was *NOT* the main goal.— Rob Goldman (@robjective) February 17, 2018
That truth used to be protected by human beings — human beings capable of flaws, to be sure, but held to account by peers and the public, and largely excellent at collecting and sharing information. Experts and editors have historically made sense of the deluge of information and shared what they learned with other people. This idea, that specific people are responsible for knowing and sharing what they know, forms the basis of our modern world.
One of the earliest promises of the internet was its ability to democratize information — to give anyone the access and audience to share what they know. This is still at the heart of why the internet is valuable, and it is an ideal worth protecting. But networks like Facebook devised a way to turn information transfer into something that looks more like a video game: they made the success of “content” not about accuracy, expertise, or truth, but about popularity. About how “clicky” something was.
Jonah Peretti — founder of BuzzFeed and master of the effortlessly viral — put this thinking succinctly in his thesis “Notes on Contagious Media,” on what makes “good” viral content. “A contagious media project should represent the simplest form of an idea,” he wrote. “For the contagious media designer, all that matters is how other people see the work. If people don’t share the work with their friends, it is a failure regardless of the opinion of the creator, critics, or other elites.” His meaning is plain: what matters is whether people share. Everything else is secondary.
And it is on this flawed and dangerous premise that we arrive at our current crossroads — an information landscape increasingly without a center. Without a constant. With no gatekeepers, experts, or editors minding the shop. Simply a self-perpetuating algorithm, easily gamed and hopelessly out of sync with the truth — a staging ground for Russian troll armies or Gateway Pundit writers with an agenda of misinformation. A highly profitable disaster for the modern age.
When conspiracy theories like the David Hogg "crisis actor" video start trending, most people think "This is bad information and should be removed," but YouTube's algorithm seems to think "Hey, lots of people seem to like conspiracy theories, so we should show them more"— Mathew Ingram (@mathewi) February 21, 2018
The answer to fixing what is broken is simple, but it’s not easy. It starts with a much better understanding of these systems and how they reward players. That means media and information literacy education long before anyone looks at a single page on the internet. If you know you’re being gamed — if you can parse what is verifiable and what is not before you share — it could make all the difference. But technology companies also need to get real about the monsters they’ve created and recognize that there is no digital replacement for educated humans dedicated to sorting and verifying the stories that get passed along.
We’ve tried it their way, and it’s not working.