In the aftermath of the 2016 presidential election, the investigation of Russian “influence” operations in the US revealed a mess of corruption and stupidity: Maria Butina’s attempted infiltration of the NRA, the hacking of John Podesta’s and the Democratic National Committee’s emails, and much more. One of the scandal’s spokes that gained the most traction in the months after the election concerned social media, and the activity of the Internet Research Agency, a Moscow organization long suspected of using paid internet trolls to fuck shit up in foreign countries.
This last scandal prompted a great deal of scrutiny of Facebook, in particular, as Facebook is where the greatest number of Russia-sourced hijinks were believed to have occurred. By the end of October 2017, Facebook conceded that 126 million people had seen “inflammatory” posts sourced by Russian agents (a revision upward from the 10 million people Facebook had claimed weeks earlier). In time, this story was fused to a broader narrative about Donald Trump operating, wittingly or, more likely, not, as an agent of the Kremlin. After all, Russia had been spurned by NATO, invaded Ukraine, been sanctioned by the US, and put its own troops on the ground in Syria, all signs that it was nursing global ambitions of its own. If you were looking for an explanation of how Trump became president with the assistance of an age-old, too-long-ignored enemy of America, MSNBC gave you a pretty good idea of who to blame.
Because we are doomed to repeat the 2016 election in small ways, every single day until we all die, there is a new development in this story. Researchers from Stanford University, working with Facebook, today identified “three Russian-backed influence networks on its site that were aimed at African countries including Mozambique, Cameroon, Sudan and Libya,” per The New York Times. The “networks” — a combination of personal accounts, propagandistic websites dressed up to look like news, and specific Pages with large followings — were tied to Russian entities like the Wagner Group, a private military contractor with a sprawling presence in a number of African countries. These pages and posts have now been removed from Facebook, which says that the networks purchased $87,000 worth of ads and that (this is purportedly the real innovation here) the accounts were primarily operated through local proxies rather than by people sitting in an office in Moscow. Facebook’s former security chief Alex Stamos told the Times that it was safe to assume that the Russian government and its allies will be trying something similar in the US for the 2020 election.
The common denominator deserving of scrutiny here is not Russia. If the collapse of the Soviet Union marked a humanitarian catastrophe (between 1991 and 1994, life expectancy in Russia fell by five years), it was accompanied by the end of an ideological confrontation that for almost fifty years had allowed the possibility of global nuclear annihilation. These circumstances have not returned, and as the historian Tony Wood wrote in the London Review of Books, “the notion of a ‘cold war’ is a kind of geopolitical speech act: if enough people in power decide they are in one, it will materialize.”
The takeaway from the findings published today should not be that the people in power must take us back to the brink. The relevant similarity is instead the platform on which all this mischief has occurred: Facebook. Though Facebook is not alone in this (YouTube was implicated on a much smaller scale in 2017), the social network’s fundamental opacity, business model, and scale are what allow such operations to flourish.
For Facebook to make money at the level at which it and its investors want it to, the company operates an ad distribution network that can accommodate its seven million advertisers. Much like the overburdened content moderators who keep track of the things that Facebook users post, Facebook simply operates at too large a scale, and in too algorithmically obscured an ecosystem (each person’s feed is unique to them), to prevent propagandistic or disinformation campaigns from taking place. And if you’re a politician, Facebook will run your ads even if it knows you are lying, anyway.
While the American government does not possess the power to instruct the Russian government as to what it can or cannot do in other countries, we do have the ability to control what Facebook does. It is, after all, an American company. There’s rising political pressure on Facebook, in part because of scandals like these, and an increasingly loud chorus of individuals who think that the government must either more tightly regulate Facebook or break it up altogether.
This is where a more productive path forward lies. The competing prescription requires people to believe that Facebook is simply another American company whose products and services are being abused by nefarious Russians. Given that Facebook seems prepared to die on any hill to protect its status quo structure and its status quo profits (again, the company has quite literally decided that false political advertising is fair game), it’s understandable why Facebook might like this story. But for nearly everyone else, who would suffer under a return to pointless brinkmanship, it would be a disaster.