The Future

Facebook was designed to prey on the fearful and angry

Who’d have guessed.

During the 2016 presidential campaign, Donald Trump’s team was getting bigger, stronger reactions out of his ever-growing voter base and raising more funds on Facebook than Hillary Clinton’s — and Facebook knew it. According to Wired, “Inside Facebook, almost everyone on the executive team wanted Clinton to win; but they knew that Trump was using the platform better. If he was the candidate for Facebook, she was the candidate for LinkedIn.”

Trump’s campaign turned the platform into one of its primary sources of funding by uploading its voter files (which include information such as “the names, addresses, voting history, and any other information it had on potential voters”) onto Facebook in order to gain access to both direct and Lookalike Audience ad-targeting tools. This let Facebook identify the general characteristics of likely Trump supporters (whether or not they had explicitly indicated so on the platform) and serve them pro-Trump campaign ads.

From Wired:

Facebook identified the broad characteristics of, say, people who had signed up for Trump newsletters or bought Trump hats. That allowed the campaign to send ads to people with similar traits. Trump would post simple messages like “This election is being rigged by the media pushing false and unsubstantiated charges, and outright lies, in order to elect Crooked Hillary!” that got hundreds of thousands of likes, comments, and shares. The money rolled in. Clinton’s wonkier messages, meanwhile, resonated less on the platform.

These targeted ad campaigns were made even more dangerously hyper-specific by the Trump campaign’s partnership with Cambridge Analytica, a data analytics company backed by Robert Mercer that specializes in the shady business of psychographic profiling. That is, it takes information gleaned from your online activity (like Facebook likes, shopping habits, or Google search history) and builds a personality profile it can then use to target individual users with personally tailored content.

“Instead of tailoring ads according to demographics, they use psychometrics,” writes Sean Illing for Vox. “It’s a simple idea, really. Rather than assuming that all women or African Americans or working-class whites will respond to the same message, they target individual voters with emotionally charged content — in other words, ads designed to tug on emotional biases.”

Cambridge Analytica was able to use psychometric data to gauge the effectiveness of the Trump campaign’s social media advertising and to plan Trump’s travel schedule around it. “So, if there was a spike in clicks on an article about immigration in a county in Pennsylvania or Wisconsin, Trump would go there and give an immigration-focused speech,” writes Illing.

In their article for Wired on the “Two Years that Shook Facebook—and the World,” Nicholas Thompson and Fred Vogelstein are disarmingly optimistic about Facebook’s ability to right its many wrongs. “[P]eople who know him say that Zuckerberg has truly been altered in the crucible of the past several months,” Thompson and Vogelstein write. “He has thought deeply; he has reckoned with what happened; and he truly cares that his company fix the problems swirling around it.” But this is far from an easily fixable issue.

Honestly, calling it an “issue” at all seems reductive, since it’s not an isolated problem to be solved. For too long, Facebook systematically promoted content that fed into fear and anger (another quote from the Wired piece: “[Advertisers] find 100 or 1,000 people who are angry and afraid and then use Facebook’s tools to advertise to get people into groups… That’s exactly how Facebook was designed to be used.”). Minor tweaks to the timeline won’t fix how Facebook is set up as a platform. Nor will delicately photoshopped images of a beaten-up Mark Zuckerberg, nor will approaches that rely on people always reaching for empathy and their best judgment. Per Wired’s piece, Facebook, like so many “open” platforms, has long tried to eschew responsibility for what happens there. It has yet to prove it can anticipate, let alone account for, its worst actors.