From the moment he wakes up to the time he goes to sleep, Robert Smith* keeps a digital remote in his pocket. The remote keeps a tally: each time he presses the button, it updates a daily count and uploads it to an online dashboard. Smith, a 29-year-old engineer living in Dover, New Hampshire, was diagnosed with schizophrenia six years ago. He uses the remote to keep track of his auditory hallucinations — how many he hears every day, and at what times.
“People at work ask me, ‘What is it like?’” said Smith, referring to his hallucinations. “And I say, ‘Imagine you’re in your car, and you set your radio to scan, and so it’s going through the stations — it’s kind of annoying. It’s like having your radio scan in your brain.’”
Every time he visits his psychiatrist, Smith brings graphs documenting the hallucinations. In one graph that tracked how many he had without medication — which Smith published in the journal JMIR Mental Health — his daily count hovered between 150 and 200. The count steadily decreased to 50 as his dose of the antipsychotic drug aripiprazole increased. Documenting an objective count of his symptoms, he said, helped him and his doctor make informed decisions about treatment.
Smith is part of a new vanguard that sees data monitoring as the future of treating psychiatric conditions. Such clinicians and researchers are building smartphone apps to monitor symptoms and treatments, communicate with patients, and predict relapse before a patient might even realize their health is declining. These apps can actively collect survey data from patients — much like Smith’s self-input tally tracker — but also passively monitor their digital presence, running in the background and recording their daily activities. As users go about their day, the apps can collect data on geolocation, accelerometry, sleep, and speech patterns, plus metadata about a patient's online activity and social communications. This data tells researchers about a user's usual activity, and is useful for flagging abrupt or concerning changes in a person’s online or offline habits.
At the core of this approach is a belief that metadata can tell us something useful about our mental health, and that a psychiatrist should be able to intervene — for example, by calling patients in for an earlier appointment, or checking in by phone — if the data tells them something could be wrong.
Beiwe is one such app and research platform, among a handful of new experimental, app-based approaches to data monitoring in psychiatry. Conceived by Harvard biostatistician JP Onnela, it generates what is called a “digital phenotype” — a moment-by-moment quantification of a person’s behaviors and characteristics.
“It’s a very flexible platform for clinical research that allows the researcher to customize what data stream they want to collect from the smartphone,” said John Torous, a psychiatrist at Beth Israel Deaconess Medical Center and clinical fellow at Harvard who researches Beiwe’s clinical applications in mental health.
This spring, Torous and his collaborators published a three-month pilot study showing Beiwe’s ability to predict relapse in people with schizophrenia — a crucial capability, because catching symptoms earlier makes them easier to treat. The researchers collected in-app survey data, along with passively gathered GPS, accelerometer, anonymized call and text message logs, and screen-time data. They aggregated the data to analyze participants’ “mobility” (including metrics like time at home, travel distances, and time spent stationary) and “sociability” (including the number and length of incoming and outgoing text messages). The researchers found that people who relapsed tended to show more anomalies in their “mobility” and “sociability” metrics in the two weeks before relapsing — with the rate of abnormalities 71 percent higher before relapse.
Beiwe is not yet used in clinical settings. “Right now, the app is only used for research — and in the informed consent process, people are told that we’re not monitoring this in real-time. So the analysis is being done retrospectively,” Torous said. “And in some ways it’s because you’re gathering so much new data and new signals, we almost have to learn: what do these signals mean? What do they indicate, or do well, or not well? It’s a relatively new tool and the research is still coming in.”
There are potential risks to collecting this type and volume of data in a medical setting. “The clinician will now know a great deal more about the patient than they ever have before,” said Paul Appelbaum, the director of the Division of Law, Ethics, and Psychiatry at Columbia University. The upshot is that this information, like all medical records, could become accessible to third parties, including courts or insurers.
“The detailed information in real-time about [a person’s] speech, behavior, patterns of activity, and location might be of intense interest to an investigating police officer, a prosecutor, or an adversary in a civil case,” said Appelbaum.
The potential risks also extend into interpersonal relationships. Monitoring apps might test the boundaries that clinicians set deliberately to distinguish their roles as professionals listening to their patients’ intimate stories. “We’ve never had the ability before to monitor patients closely outside the office,” said Appelbaum. “In terms of maintaining those boundaries... I as a clinician would know a great deal of detail about my patient’s life outside the office, without the patient making a decision... I'm now inserted into their life in a way that I never have been before.”
“It’s not really clear yet what [it] does to the patient-clinician relationship,” Appelbaum continued. “What does it mean to a patient when the clinician says, ‘I need more information than just what you’re telling me. In between sessions I want to be able to track you at all times.’”
Smith said it’s not clear yet whether there’s enough incentive for patients to accept being passively monitored by their doctors. “How do you convince the patient that this is worth sacrificing their privacy?” Smith asked. “How could you ensure that this isn’t going to work against them? If I’m giving you all this information, do I lose power? Because patients never want to lose power.”
“I would have to think about it before I would passively allow my doctor to tap my phone,” Smith said. “I would need to be convinced that it would be worth me giving up that power.”
Still, Smith notes the value of more objective data about his symptoms for informing decisions he made with his doctor about treatment. “One of the biases that I had before I was keeping an objective count was, if I've had a bad day — let’s say I got a parking ticket on my way to my psychiatrist’s office — when I show up, I’m going to be really flustered. And the psychiatrist is going to think that this is my current status over an entire month, because they only get that one snapshot.”
Torous is similarly optimistic about the potential of digital monitoring tools in psychiatry and a new “digital therapeutic relationship,” in which data collected on a trusted, secure platform can empower patients and lead to more shared decision-making about treatment.
“We are planning to start a clinic that will use apps and digital tools as part of people’s care,” Torous said. “We’ll still see people face to face, but we’ll be using this type of technology to augment and extend. And we’ll be reviewing in sessions with people, so it's like this is a kind of interactive diary. But it would be reviewed with the psychiatrist in person, discussing it together. That would be the first step.”
Torous also cautioned against many of the existing direct-to-consumer apps that collect health data and sell it to pharmaceutical and biotech companies. Apps like Beiwe, he said, which aren't built by private companies, will be central to digital psychiatry's next steps, in which clinicians themselves can build secure platforms and determine best practices.
“We can’t predict the future,” Torous said, “but it’s probably better that we define how these tools are used in care, and we do it in an ethical and safe way — than some company telling us, a couple of years later, ‘This is how you practice.’”
*Name has been changed.