The Future

Don’t let anyone tell you platforms can’t be regulated

We have strict, proactive regulations for lots of industries, and we can definitely get a handle on Facebook and everything else.

Hidden away in the “Settings” section of my Motorola smartphone is a list of legal disclaimers and regulatory information so extensive as to be unintentionally funny. Among other things, it promises that the device’s antenna will adhere to arcane guidelines about radio interference established by the Federal Communications Commission, as well as equivalent laws in Canada and the European Union. Dig deep enough, and it warns about far-fetched scenarios ranging from the handset causing an allergic reaction, to shattering and cutting me, to inadvertently setting off explosive devices.

Absent from those many disclaimers — or perhaps worded so obliquely that I couldn’t spot it — is a far more obvious danger: that the device will be used to collect personal data that shadowy entities can wield against me in various nefarious ways. The unfolding scandal of Cambridge Analytica, a Trump-affiliated political data firm found this week to have obtained detailed information about tens of millions of Facebook users without their knowledge or consent, has attracted so much negative attention that it wiped more than $9 billion off the social giant’s market value.

That’s the strange thing about the internet: unlike automakers, which are required to include seatbelts and airbags, or restaurants, which must follow stringent health codes, there’s a sense that technology companies can do whatever they want, no matter how harmful the results. Now, for the first time that the idea has had any political plausibility, experts are beating the drum for meaningful, large-scale regulation of big data. “I don’t know that we shouldn’t be regulated,” Mark Zuckerberg said in an interview with CNN on Wednesday night; we couldn’t agree more.

In a Guardian op-ed, representatives of the Open Markets Institute recommended nine steps the Federal Trade Commission could take to restructure Facebook, from forcing the company to enact transparent standards to spinning off Instagram, WhatsApp and the corporation’s ad network.

Those would be drastic measures — and untenable in today’s political climate — but the fact that they’re being suggested at all shows a change in how we see big data. Now that it has demonstrably been used to harm society, its threat no longer seems hypothetical; big data looks more like lead paint or an industrial byproduct than some harmless abstraction of the ad industry.

Writing for Bloomberg Businessweek, Paul Ford even suggested that the United States create something like the Environmental Protection Agency, but for the internet — a Digital Protection Agency that would “clean up toxic data spills, educate the public, and calibrate and levy fines.”

But, as Ford acknowledged, it’s difficult to imagine the federal government summoning the political will to create such an entity when the most explosive breaches of public trust, like Cambridge Analytica, are tightly tied to partisan issues like interference in the 2016 presidential election.

The European Union’s stringent new data protection laws, which come into effect in May, could also play a role in cleaning up the data industry. David Carroll, a professor at Parsons School of Design, is suing Cambridge Analytica’s parent company, SCL Group, for a full copy of the file it assembled about him using data gleaned from Facebook — an effort, in effect, to use Europe’s stronger consumer protection laws to hold powerful companies accountable elsewhere in the world.

Failing legal recourse, it’s tempting to imagine a cultural shift in the data industry. Yonatan Zunger, a former Googler, suggested on Twitter that the Cambridge Analytica scandal could act as a moment of historical reckoning for computer science itself. Chemistry experienced a similar moment, he argued, after the inventions of dynamite and chemical weapons, as did physics after the atomic bomb. According to Zunger, it’ll be difficult to trust big tech until software engineers start to grapple with the ethical implications of their work.

But the tech industry may take a long time to grow a conscience on its own. “Young engineers treat ethics as a speciality, something you don’t really need to worry about; you just need to learn to code, change the world, disrupt something,” Zunger wrote.

Facebook founder Mark Zuckerberg pledged this week to prevent similar breaches in the future. But in reality, Cambridge Analytica’s data grab was just a particularly visible example of a phenomenon so common that we scarcely notice it. The $80 billion online advertising industry, dominated by Google and Facebook, has assembled vast troves of information about nearly every internet user — some of it gleaned from tidbits we upload ourselves to social media, and some of it inferred from our browsing histories, financial data and the people we interact with.

Though that system forms the economic backbone of the modern internet by serving users with tightly targeted advertisements, fewer than a third of Americans say they understand how companies share their personal information. That’s striking in an era of consumer protection in which alcohol, cigarettes and even smartphones come packaged with dire warnings about health and safety — and some experts are beating the drum for universal disclosure requirements for data collection.

What’s certain is that Facebook has shown time and again that it can’t be trusted to regulate itself. This week, a former Facebook employee recalled in the Washington Post how, in 2011, he discovered that an app called Klout was creating “ghost” profiles of users’ children. When he pressed Klout on the issue, its representatives insisted they weren’t breaking any rules, though they did agree to stop creating the ghost profiles.

“In the wake of this catastrophic violation, Mark Zuckerberg must be forced to testify before Congress and should be held accountable for the negligence of his company,” he wrote. “Facebook has systematically failed to enforce its own policies. The only solution is external oversight.”

Jon Christian is a contributing writer for The Outline.