The Future

Hiding behind a veil of ignorance

There’s not much soul-searching at Facebook, only self-serving rationalizations.

In the new ripped-from-the-headlines film Dark Waters, which stars Mark Ruffalo as a corporate attorney turned crusader against the DuPont chemical conglomerate, what ultimately convinces Ruffalo and the audience of the justice of the plaintiffs’ cause is a pile of corporate paperwork. Delivered by the truckload to the office of Ruffalo’s lawyer character, boxes of subpoenaed documents reveal that DuPont knew it was poisoning the world with the toxic chemicals used to make Teflon coating. But because DuPont was not legally required to share what it knew with the EPA, the company buried its findings. They only became public once Ruffalo’s real-life counterpart, Rob Bilott, dissected them with fanatical care and sent what he found to the federal government.

Initially, DuPont paid a measly $16.5 million fine. It ultimately settled a $671 million class-action lawsuit, but only almost two decades after Bilott filed his first suit in 1999. The case’s original plaintiff, a farmer, died of cancer likely caused by Teflon-related environmental contamination of his West Virginia home before the settlement was reached.

Environmental metaphors come naturally to Silicon Valley (e.g., “app ecosystems,” “data is the new oil”), and corporate scandal in the tech industry, particularly at Facebook, is where my mind went while watching Dark Waters. Like DuPont, Facebook has developed (or at least owns and operates) a set of services that are lightly regulated and whose core technology is opaque to everyone outside the company and to many people inside it. What if we had some kind of documentation that showed what Facebook knew about its products — primarily targeted advertising, which has been used to discriminate against users and to push illegal political propaganda at them — and when it knew it? And what would the public know that it didn’t know before?

On Tuesday, The New York Times published one such memo, summarizing its conclusion in the following headline: “Don’t Tilt Scales Against Trump, Facebook Executive Warns.” The piece of writing in question was initially published on Dec. 30 on the “internal Facebook page” of Andrew Bosworth, a Facebook vice president who previously ran the company’s advertising business (the most important division of the company, as it is the one that pays for everything else) and now runs its virtual and augmented reality unit. Most relevantly, Bosworth ran Facebook’s ad unit during the 2016 election, and was described by the Times as “viewed by some inside Facebook as a proxy” for Mark Zuckerberg.

Bosworth’s post is less a missive about what Facebook should or shouldn’t do than a rumination on what the last few years have been like, and on how Bosworth feels they should dictate what Facebook does next. The primary focus of his post, which makes sense given that there’s a presidential election in nine months, is election interference, and what Facebook is and is not responsible for.

There is no “aha!” moment in Bosworth’s memo, and there’s nothing in it that we did not know before. In fact, the public learned in 2018 just how much Bosworth liked writing this kind of missive for his coworkers, when BuzzFeed News published a 2016 memo in which Bosworth argued that Facebook’s continued growth and data collection were a net positive for the world. In the cases of DuPont (and General Motors, ExxonMobil, and Big Tobacco), documents told a story about what corporate executives knew and when they knew it. In the case of Bosworth — or “Boz,” as he is often called — his new memo describes what Facebook does not know and refuses to learn.

Bosworth begins agreeably enough. The Cambridge Analytica scandal, according to him, was a case of media coverage “where the details are almost all wrong but I think the scrutiny is broadly right.” (Though Facebook did not leak data to CA in the way described by the media, the conditions that allowed CA to perform verboten experiments on Facebook are the fault of Facebook.) Some of what Bosworth says is objectively true, such as his claims that Cambridge Analytica was a pure snake-oil outfit and that “fake news” and incendiary headlines from generically grifty and fringe websites were a bigger problem than Russian-sourced misinformation.

But what Bosworth ultimately comes to wrestle with, as the Times headline indicates, is the question of what Facebook should do, policy-wise, next time around. Should Facebook rework its advertising and content policies to marginalize right-wing politics? “I find myself desperately wanting to pull any lever at my disposal to avoid the same result [as 2016],” Bosworth writes. “So what stays my hand?”

It is here that things get eerie, and what surfaces is a dorm-room-quality thought process from someone who works in the control room of one of the most powerful media and advertising companies to have ever existed. Bosworth opens this section by offering a parable from The Lord of the Rings, in which Frodo offers up the One Ring to the elf queen Galadriel, who in turn shuns the offer because she knows the ring’s power would “eventually corrupt her.” He follows this up by citing the work of the moral philosopher John Rawls, appropriating his conception of the “Veil of Ignorance,” a thought experiment that asks the reader to consider an action while wearing the titular veil, which obscures their race, age, gender, and so on; thus they are forced to consider all possible outcomes, not just the one that they themselves will likely experience. The “Veil,” Bosworth says, prevents him, and by extension Facebook, “from limiting the reach of publications who have earned their audience, as distasteful as their content may be to me and even to the moral philosophy I hold so dear.”

Bosworth reveals nothing here about the confidential ways in which Facebook codes its algorithms to determine what its users see, or what kinds of conversations it has behind closed doors with outlets like its business partners Fox News and Breitbart. But his reasoning and his message to his coworkers — that we wield enormous power and with it comes great responsibility — skip right over the question that concerns actual critics of Facebook, and not just the ones of Bosworth’s imagination: Why should Facebook have this much power in the first place?

The Lord of the Rings comparison is common in Silicon Valley; venture capitalist and Facebook board member Peter Thiel quite literally founded a surveillance company called Palantir, named for an omniscient seeing-stone from the books that, like the One Ring, has the power to corrupt those who use it. In the cases of both Palantir and Facebook, executives defend their companies on the grounds that they must wield this power responsibly, because the power, though it has a high potential to corrupt its user, is also used to benefit a lot of people. Where Palantir defends its surveillance practice (and ICE contracts, and so on) on the grounds of “better us than China,” Facebook, and Bosworth especially, have always offered a version of the logic from a decade-old movie poster — “You don’t get to 500 million friends without making a few enemies” — asserting that some light election interference and other abuse of the platform is bound to happen when you are “connecting” the globe.

Bosworth’s use of the “Veil of Ignorance” is similarly flawed, not because he’s applying it to a situation where it doesn’t fit, but because he reaches a conclusion contrary to what the thought experiment would ordinarily produce. As Matt Bruenig, the founder of the People’s Policy Project (and a man whose Twitter avatar has, for years, literally been a photo of John Rawls), put it to me:

For instance, behind the VOI, you would not know whether you are a racist publisher or the black person the racist publisher is saying should be killed via genocide. So what decision would you make about whether to allow that content on your platform behind the VOI? Under the usual risk-averse reasoning used for VOI, you would probably conclude the content should be disallowed because you wouldn't want to risk being the black person when the veil lifts.

Bosworth repeatedly invokes his personal politics (“I donated the max to Hillary,” “As a committed liberal”) throughout the memo, which lends the writing not the imprimatur of someone speaking from a place of credibility, but a sweaty kind of special pleading. Before Bosworth concludes his post with a call for Facebook employees to “get ahead” of the next big issues, polarization and algorithmic transparency, he draws a comparison between the consequences of social media usage and eating too much sugar. Rather than compare Facebook to nicotine, Bosworth says, it’s more like sugar, which isn’t good for you but, hey, it’s your choice to consume it. “If I want to eat sugar and die an early death that is a valid position. My grandfather took such a stance towards bacon and I admired him for it,” Bosworth writes.

There are probably more DuPont-like, detail-rich documents sitting in Facebook executives’ inboxes, ones that explain how and why its algorithms are weighted the way they are, and which would reveal whether Facebook works more like “sugar” or “nicotine.” Until then, we are forced to reckon with statements like “what I expect people will find is that the algorithms are primarily exposing the desires of humanity itself, for better or worse” (that’s Bosworth again), because Facebook has long stopped independent researchers from coming to their own conclusions. Bosworth has a deserved reputation as the guy at Facebook most likely to put bluntly what much of Facebook’s leadership privately thinks (after all, a number of these memos have been published widely, and Boz still has a job). What that tells us is that Facebook isn’t thinking deeply at all.

Noah Kulwin is the Future Editor of The Outline.