The Future

Turns out our evil tech overlords may be no match for Europe’s new privacy laws

Thanks, GDPR.

Though it’s only been a few weeks since the European Union’s General Data Protection Regulation (GDPR) officially went into effect, its impact is already noticeable. Sites have gone dark or pared down their tracking-laden homepages, while users have struggled to stay afloat under the deluge of privacy policy update emails. But, strangely enough, the most interesting side effect of the GDPR can be found outside its regulatory borders. In a rather shocking turn of events, U.S. lawmakers, policy wonks, and academics alike appear to be genuinely inspired by the GDPR’s bold design and efficacy. New measures in the works would deny businesses the ability to share and sell people’s data willy-nilly, more clearly identify where and to whom personal information has been disclosed, and even require companies to alert people within 72 hours if their data has been stolen. More importantly, experts are trying to figure out how to communicate all of this without people clicking through and scrolling past without actually absorbing what might be happening...and you’ve already stopped reading, haven’t you.

As dull as it may sound, things are ramping up enough to actually warrant your attention. Gail Slater, special assistant to President Trump for tech, telecom, and cyber policy at the White House National Economic Council, is actively looking into the feasibility of implementing similar regulations in the United States, Axios reports. Similarly, the California Consumer Privacy Act, a ballot measure up for adoption in November 2018, aims to help consumers take back control of their personal information by creating a host of new privacy rights: it would give citizens the power to deny businesses the ability to share and sell their data, more clearly identify where and to whom their personal information has been disclosed, and afford certain protections to those whose data has been misused. The Social Media Privacy Protection and Consumer Rights Act of 2018, introduced by Senators Amy Klobuchar, a Democrat from Minnesota, and John Kennedy, a Republican from Louisiana, takes this notion of all-American data protection a step further. The bill proposes that any company that has mishandled users’ data (whether through general negligence, a breach, or what have you) must alert consumers within 72 hours, a rather tighter turnaround than Facebook’s three years of silence post-Cambridge Analytica.

A Tuesday Senate hearing probing the prospects for national data protection featured testimony from Ashkan Soltani, an independent privacy researcher and former chief technologist of the Federal Trade Commission (FTC); John Battelle, who serves on the board of directors of the data broker Acxiom, which used to work with Facebook; and Aleksandr Kogan, the former Cambridge University psychologist whose app scraped the data of as many as 87 million Facebook users.

“I think what we forget as consumers is that anything that pops up between us and something we want is annoying and we want it to go away as quickly as possible,” Battelle said. “Even if it happens to be something that’s quite important or good for us in the long run.” Battelle suggested that third parties partner with the industry in order to weave this informed consciousness into the product flow. “So that the moment at which data may be given up, you are reminded of the possible implications of that as opposed to just asked to dismiss a modal dialog box with an ‘okay’ button,” he said.

“I think we need to shift away from this idea of blanket consent to informing people about the particulars,” added Kogan, who, it’s worth noting, was literally present at the hearing due to his abuse of Facebook’s blanket consent program.

“Any federal legislation should aim to address many of the key problems facing consumers online: lack of meaningful consent, inadequate data security practices, and a lack of any real transparency from large online companies,” wrote Soltani in a statement to the Senate subcommittee. “A Privacy Bill of Rights is a term, and we need to make it real in ways that are enforceable and really protective of consumers without unintended consequences.”