Data is like glitter: once it's out of the bottle, it's everywhere and impossible to control. In the latest version of its data policy, Facebook declares that it will no longer share users' information with data brokers, and that it will be completely clear about who gets data and where it goes. But Facebook is still claiming far more control over the situation than it may actually have; put together the picture that's emerged over the last couple of weeks, and things are a huge mess.
A few other stories from this week: Facebook revealed that 87 million users' data was leaked to Cambridge Analytica, not 50 million as originally reported. Actually, strike that; Facebook said, in not so many words, that if you are one of its 2.2 billion users, you should just assume someone else has obtained, and can actively use, any data you've put into Facebook up to this point. Facebook also revealed that it had been scraping call and SMS information from Android users' phones, but will now graciously delete anything older than a year. And Facebook scans private Messenger photos and links to make sure they don't violate Facebook's standards.
Facebook has been reeling ever since Christopher Wylie of Cambridge Analytica came forward to describe how his company had been using Facebook data it received from an academic researcher, who had scraped it all together with a single third-party Facebook app. While Zuckerberg apologized and the company is answering to both Parliament and Congress, its actions since then suggest it has little practical reason to worry. As the EFF points out, Facebook has by now amassed so much data internally for marketing and advertising purposes that it simply no longer needs to sell data outside the company to give advertisers what they want. Facebook could have kept that business going, but it didn't take much for the bad PR to outweigh the benefit; Facebook could simply drop it.
Not that this matters very much to individual users: data from Facebook has been circulating outside of Facebook for so long that there isn't much more it could reveal about any particular user, anyway. Data is routinely shared and sold between massive data brokers, and requesting deletion from one of them means it's only a matter of time until your information is shared by another. Facebook cannot substantively change anything by changing its policies now; the data is out there in the world, and Facebook has essentially no control over who does what with it.
An open question in all this, even as the EU enacts much stricter privacy protection laws, is what happens when data changes hands between third, fourth, and even more distant parties — an issue that this Verge piece acknowledges is untrodden territory. If a third-party Facebook app slurps up information from Facebook, then turns around and sells it to a marketing company (even though this is explicitly forbidden in Facebook's policies), what can Facebook do about it? Facebook has no eyes on that information once it leaves Facebook's servers. What if the data was actually sold to two dozen marketing companies, but Facebook only finds out about the first dozen? Even if Facebook hunts down those twelve and asks them to delete it, it's impossible to know with certainty that anything is truly deleted once it's been recorded.
Even Facebook's new policies would not prevent a situation like this from happening again: an app can call down all the data users will give it, hold onto it, hand it off, and Facebook might never find out. If Facebook does find out, it can probably go after the offending company for threatening the security of its private business. But even then, the laws would protect these companies from each other, not the users whose data was traded around.