The Future

85%
The accuracy of an algorithm that uses Apple Watch data to predict early signs of diabetes, high blood pressure, and other health issues.

If your Apple Watch knows you’ll get diabetes, who can it tell?

Silicon Valley is starting to publish research using Apple Watch data, but there are still no laws governing what information companies can keep, sell, or share.

On the morning of February 7, the Hilton New Orleans Riverside was packed with researchers for the final day of the Association for the Advancement of Artificial Intelligence’s thirty-second annual conference. Avesh Singh and Johnson Hsieh were there to present the first published research from their app Cardiogram, built using Apple’s HealthKit developer framework.

Cardiogram’s machine learning algorithm, called “DeepHeart,” ingested large amounts of health data, like heart rates and step counts, volunteered by participants from wearables like Apple Watches and Fitbits. It also used information about users’ exercise, sleep, and emotional habits, which was handed over to Cardiogram by Health eHeart, a research project at the University of California San Francisco. With all this information, DeepHeart predicted early signs of diabetes, high cholesterol, high blood pressure, and sleep apnea in more than 14,000 people. It turned out DeepHeart was accurate 85 percent of the time.

By tantalizing people with the idea of predicting diabetes or other health issues, apps like Cardiogram and devices like the Apple Watch get people to volunteer personal biometric data that can be combined and analyzed. But these companies can pass that data on to new services, apps, and research projects without giving people a clear sense of where the data is going or what their rights are. Sometimes the data is collected in such a way that the people who were talked into volunteering it have few rights over it at all: no say in who sees it, where it lives, or how long it’s kept.

In a phone conversation, an Apple representative could not confirm whether insurance companies, pharmaceutical companies, or bioinformatics companies could receive this information. The iOS App Store privacy policy says apps can distribute this data to unspecified “health management” and “health research” companies.

Cardiogram definitely won’t be the last free app that uses AI like DeepHeart to study medical data. With newly intimate access to the ins and outs of people’s lives through the Apple Watch, Apple and its third-party developers have been actively pursuing the world of medical data and health research. Just this month, Apple launched the “Apple Heart Study” with Stanford Medicine, as well as “Close Your Rings,” an Apple Watch page that encourages users to make healthy lifestyle choices throughout the day.

Although there are laws in place to protect medical data, they don’t apply to Apple Watches and other wearables. These devices collect medical information, such as heart rate data, but a wearable would have to be approved by the FDA as a “medical device” for that data to be protected by law.

Apple Watches, Fitbits, and other wearable devices haven’t been approved by the FDA as medical devices, meaning there aren’t any laws governing how that medical information is stored and secured, or whether it’s sold to third parties. In essence, when a wearable collects your medical data, consumers don’t know exactly who gets it, what it’s used for, or whether it could affect real-life outcomes such as insurance rates.

Heart rate readings fed to Cardiogram by users who elected to enter the program.

Apple has been actively communicating with the FDA about medical device regulations since at least 2013. In November 2017, it joined an FDA precertification pilot program that would allow the company to be regulated as a firm rather than having each of its products individually certified.

Kendra Serra, a technology lawyer at the Cyber Law Clinic, told The Outline in a phone call that so long as Apple Watches aren’t classified as medical devices, they slip through a legal loophole in the Health Insurance Portability and Accountability Act of 1996, or HIPAA, which protects only medical data collected by what the FDA considers “medical devices.”

HIPAA limits the use of medical information to necessary medical uses. A pharmacist at Walgreens, for instance, can’t sell your prescription information to advertisers, and Walgreens must have measures (such as encryption) in place to protect your information. But if Apple Watches aren’t medical devices and HealthKit apps aren’t medical software, then the data they use and analyze isn’t HIPAA-protected medical data, and there are no rules except those companies impose upon themselves.

Serra noted that HIPAA compliance focuses on the way data is stored, but not the way it’s shared. “Generally, what worries me is the way in which this data can end up places you wouldn't necessarily predict — so the kinds of third parties who may have access to [data] not through a security breach, but just through how data get shared,” Serra said. “Companies that are maybe moving...to gathering more highly sensitive data haven't necessarily taken the steps to secure it correctly.”

According to the iOS App Store rules, all apps that store user health or medical data must have a privacy policy, obtain user consent, and can’t sell user data to third parties such as advertisers. However, this is far from equivalent to HIPAA compliance, which apps can only adopt on a voluntary basis. While Health eHeart voluntarily abides by HIPAA regulations, Cardiogram does not.

RadioShack very nearly sold the social security numbers, birth dates, and phone numbers of its 117 million customers.

Cardiogram is a private Silicon Valley startup. The information it collects isn’t anonymized and is kept indefinitely, meaning it can be used to identify and locate users. While Cardiogram’s own privacy policy states it does not give out personally identifying information to third parties, it does say it may disclose personal information in the event of a “corporate re-organization” or “a sale of all or a substantial portion of our assets.” This calls to mind the time RadioShack very nearly sold the social security numbers, birth dates, and phone numbers of its 117 million customers when it liquidated its business, before it was blocked by the FTC and dozens of state attorneys general. According to Natasha Tusikov, a professor at York University, unsecured biometric data should be a concern for users.

“People will give up sensitive health data if they think the company is properly taking care of and securing that data,” Tusikov said. “If they've categorized it wrong, if they know they haven't anonymized it properly, this is a significant security and privacy risk.”

Tusikov said that the individual Privacy Policies and Terms of Use for apps also serve as legal contracts governing whether information from those apps goes to third parties (and if so, which third parties get the data). But no one reads these documents, and even if they seem palatable at first, they can change at any time.

“These Terms of Service agreements are updated frequently,” Tusikov said. “People don't have the time, sometimes the legal knowledge, the expertise, or just the ability to keep up with all of the various services and apps that they subscribe to. What [users] don't realize is that there are other third parties involved.”

Data storage practices for apps using the HealthKit API (how long information is kept, or exactly what software and hardware is used to store it) are governed by a set of App Store recommendations, not rules.

Theoretically, a private Silicon Valley startup that created a medical app could keep that information on vulnerable software and hardware, forever. Cardiogram, for instance, holds its users’ data “for as long as it remains necessary for the identified purpose.” This means that even if users delete Cardiogram, the company can keep their data indefinitely.

Wearables like the Apple Watch have the potential to provide a ton of new insight for health research, because the data they collect can be so comprehensive and granular. A Cardiogram user could hardly be blamed for wanting to use their new Watch to figure out whether they are showing early symptoms of diabetes. But it’s still too easy for companies to take advantage of mortality anxiety without giving people the protections and privacy they are owed, especially when the scale of third-party data brokering is, at this point, almost beyond comprehension.