On the morning of February 7, the Hilton New Orleans Riverside was packed with researchers for the final day of the Association for the Advancement of Artificial Intelligence's thirty-second annual conference. Avesh Singh and Johnson Hsieh were there to present the first published research from their app Cardiogram, built with Apple's HealthKit developer framework.
Cardiogram’s machine learning algorithm, called “DeepHeart,” ingested large amounts of health data, like heart rates and step counts, volunteered by participants from wearables like Apple Watches and Fitbits. It also used information about users’ exercise, sleep, and emotional habits, handed over to Cardiogram by Health eHeart, a research project at the University of California, San Francisco. With all this information, DeepHeart predicted early signs of diabetes, high cholesterol, high blood pressure, and sleep apnea in more than 14,000 people, and it was accurate 85 percent of the time.
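To make that accuracy figure concrete: a model like DeepHeart takes per-user signals such as resting heart rate and step counts and emits a risk label, and its accuracy is simply the fraction of users it labels correctly. The toy sketch below scores a classifier the same way. Everything in it, from the synthetic data to the threshold rule, is an illustrative assumption, not Cardiogram's actual model or data.

```python
import random

random.seed(0)

def synthetic_user(has_condition):
    """Fake per-user features: (avg resting heart rate, avg daily steps).
    The distributions here are invented for illustration only."""
    if has_condition:
        return (random.gauss(78, 6), random.gauss(5000, 1500))
    return (random.gauss(66, 6), random.gauss(9000, 1500))

# Label 1 = condition present, 0 = absent; 500 synthetic users of each.
users = [(synthetic_user(1), 1) for _ in range(500)] + \
        [(synthetic_user(0), 0) for _ in range(500)]

def predict(features):
    """Toy stand-in for a trained model: flag an elevated resting
    heart rate combined with low daily activity."""
    heart_rate, steps = features
    return 1 if heart_rate > 72 and steps < 7000 else 0

# Accuracy = share of users whose predicted label matches the true label.
correct = sum(1 for features, label in users if predict(features) == label)
accuracy = correct / len(users)
print(f"toy accuracy: {accuracy:.0%}")
```

The real system reportedly used a deep neural network rather than a hand-set threshold, but the scoring arithmetic behind a claim like "accurate 85 percent of the time" is the same.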
By tantalizing people with the idea of predicting diabetes or other health issues, apps like Cardiogram and devices like the Apple Watch get people to volunteer personal biometric data that can then be synthesized. But these companies can pass that data on to new services, apps, and research projects without giving people a clear sense of where it is going or what their rights are. Sometimes the data is collected in such a way that the people who were talked into volunteering it have few rights to it at all: no say in who sees it, where it lives, or how long it’s kept.
Cardiogram definitely won’t be the last free app that uses AI like DeepHeart to study medical data. With newly intimate access to the ins and outs of people’s lives through the Apple Watch, Apple and its third-party developers have been actively pursuing the world of medical data and health research. Just this month, Apple launched the “Apple Heart Study” with Stanford Medicine as well as “Close Your Rings,” an Apple Watch page that actively encourages users to make healthy lifestyle choices throughout the day.
Although there are laws in place to protect medical data, they don’t apply to Apple Watches and other wearables, even though those devices collect medical information such as heart rate data. For that data to be protected by law, the wearable would have to be approved by the FDA as a “medical device.”
Apple Watches, Fitbits, and other wearable devices haven’t been approved by the FDA as medical devices, so no law governs how the medical information they collect is stored, secured, or sold to third parties. In essence, when a wearable collects your medical data, you don’t know exactly who gets it, what they use it for, or whether it could affect real-life outcomes such as insurance rates.
Apple has been actively communicating with the FDA about medical device regulation since at least 2013, and in November 2017 it joined an FDA precertification pilot that would allow it to be regulated as a firm rather than having to get each product individually FDA-certified.
Kendra Serra, a technology lawyer at the Cyber Law Clinic, told The Outline in a phone call that as long as Apple Watches aren’t classified as medical devices, they fall through a loophole in the Health Insurance Portability and Accountability Act of 1996, or HIPAA, which protects medical data only when it is collected by what the FDA considers “medical devices.”
HIPAA limits the use of medical information to necessary medical uses only. A pharmacist at Walgreens, for instance, can’t sell your prescription information to advertisers, and Walgreens needs to have measures (such as encryption) in place to protect your information. But if Apple Watches aren’t medical devices and HealthKit apps aren’t medical software, then the data they collect and analyze isn’t HIPAA-protected medical data, and there are no rules except the ones companies impose on themselves.
Serra noted that HIPAA compliance focuses on the way data is stored, not the way it’s shared. “Generally, what worries me is the way in which this data can end up places you wouldn't necessarily predict — so the kinds of third parties who may have access to [data] not through a security breach, but just through how data get shared,” Serra said. “Companies that are maybe moving...to gathering more highly sensitive data haven't necessarily taken the steps to secure it correctly.”
That data can also outlive the company that collected it: during its 2015 bankruptcy, RadioShack very nearly sold the social security numbers, birth dates, and phone numbers of its 117 million customers.
“People will give up sensitive health data if they think the company is properly taking care of and securing that data,” said Natasha Tusikov, an assistant professor at York University who studies internet regulation. “If they've categorized it wrong, if they know they haven't anonymized it properly, this is a significant security and privacy risk.”
“These Terms of Service agreements are updated frequently,” Tusikov said. “People don't have the time, sometimes the legal knowledge, the expertise, or just the ability to keep up with all of the various services and apps that they subscribe to. What [users] don't realize is that there are other third parties involved.”
Data-storage practices for apps using the HealthKit API, meaning how long information is kept, or exactly what software and hardware is used to store it, are governed by a set of App Store recommendations, not rules.
Theoretically, a private Silicon Valley startup that builds a medical app could keep that information on vulnerable software and hardware forever. Cardiogram, for instance, holds its users’ data “for as long as it remains necessary for the identified purpose,” which means that even users who delete Cardiogram leave their data with the company indefinitely.
Wearables like the Apple Watch have the potential to bring a ton of new insight to health research, because the data they collect can be so comprehensive and granular. A Cardiogram user could hardly be blamed for wanting to use their new Watch to figure out if they are showing early symptoms of diabetes. But it’s still too easy for companies to take advantage of mortality anxiety without giving people the protections and privacy they are owed, especially when the scale of third-party data brokering is, at this point, almost beyond comprehension.