John Torous, M.D., M.B.I., is director of the Division of Digital Psychiatry at Beth Israel Deaconess Medical Center in Boston, where he investigates the risks and benefits of behavioral health technologies. Torous also chairs the American Psychiatric Association’s workgroup on smartphone apps. Transforming Care spoke to him about his research and his views on the potential and pitfalls of new technology.
Transforming Care: We’ve never heard of a division of digital psychiatry. What is it?
Torous: Our division conducts research projects funded by industry, philanthropies, and the National Institutes of Health and helps educate clinicians as well as medical students and residents on how to evaluate digital health tools and incorporate them into practice. We also host a small digital clinic where we see people for face-to-face care and ask them to use apps outside of sessions to augment that care. We work with a lot of people with schizophrenia, for instance, who now have access to smartphones but may never have had any formal training on how to use them for recovery.
Transforming Care: Your platform MindLAMP and a lot of your research focus on digital phenotyping, which involves using smartphones to track people’s sleep, physical activity, and other behavior patterns and to monitor human–computer interactions. What have you learned so far about the reliability of these markers?
Torous: We know they have a lot of potential to inform us about functional outcomes, which has been the missing piece for this field. The tricky part is getting the data in a reliable way: there are so many different types of phones and operating systems, and people use their phones very differently. What we’re focusing on now is metadata, meaning how someone interacts with the phone. Interacting with the phone takes a certain amount of attention, memory, and engagement, and we can begin to measure those. For example, with one of the cognitive assessment games we have designed, we can see how long a patient with schizophrenia spends on a screen, how quickly they tap numbers, and whether they stop and get stuck in the middle. It’s a new way to look at cognition that gets around some of the privacy and ethical concerns that arise when you collect continuous passive data like GPS and call or text logs from phones.
Transforming Care: Why did you create your own platform for collecting digital signals?
Torous: We wanted something that was completely free and open source. A lot of us do similar research on digital phenotypes, but the tools people are selling are prohibitively expensive. There are costs to maintaining an app platform, but they are not nearly as high as the fees many companies charge for use of their apps. Another challenge with proprietary tools is that they don’t share their methods, so their results are not reproducible. Because MindLAMP is free, it’s exciting to see its uptake. For example, researchers in China and Canada are using it to study schizophrenia, and a group in Los Angeles is using it to study language and stress. Clinicians in the Netherlands and Nepal are also using it to learn new information about their patients. At the end of the day, it’s about being able to capture real-time data from our patients in a safe, secure, and reliable manner and combine that with passive data.
Transforming Care: How do these data get incorporated into practice? Can you give us an example?
Torous: We had a patient with schizophrenia who was becoming more depressed and psychotic. When we looked at the data with him, it was very clear that something about his sleep was likely exacerbating his symptoms. We helped him monitor his sleep, and he could see that as his sleep improved, his symptoms got better. Until he saw it week by week, I don’t think he believed those two went together. With clinicians, we stress the importance of using data to open a dialogue. One of the big fantasies of this field is that data alone will fix everything. But what probably helps the most is the therapeutic alliance. Some digital health apps can help build that alliance by bringing in new kinds of information or making certain patterns more apparent. Patients will bring up things that wouldn’t have come up before, and that gives people a way to start a new conversation. The challenge with a lot of digital health apps is that the data go into a silo and are very hard to access, let alone share with your clinician during a visit. One app may track your medicine, another your cognitive-behavioral therapy (CBT). It’s very hard to put it all together and understand how you are doing.
Transforming Care: As you look across the landscape of digital apps, where do you see the most activity and where are the missed opportunities?
Torous: We're definitely seeing a lot in terms of predictive analytics and digital phenotyping offerings that leverage information from online behavior and smartphone signals to discern patterns and identify at-risk patients. There is also a lot of what I would call CBT and mindfulness delivered to your phone. What we're not seeing as much of is thinking about complete clinical pathways or tools that look across the patient journey. In some ways you could say the digital health tools are addressing potholes in the road. Many solutions today are more like carve-outs in that they aren’t easy to integrate into clinical care. There’s also a lot of duplication of apps and algorithms, with less focus on how these tools can and should be used in actual care settings. (For examples of how digital health tools are being integrated into clinical care, see our companion piece.)
Transforming Care: You’ve also spent a fair amount of time thinking about how to evaluate apps in terms of privacy and safety. What are the biggest concerns in this regard?
Torous: The privacy concerns are hard to minimize, especially for a health plan looking to pick an app to invest in. We actually hacked some of these mental health apps and intercepted their traffic to see whether what the privacy policy promises about where they send data is true, and what we found is that over half of the apps were sending data to parties they never disclosed. We described our findings in JAMA, and it was shocking enough that I went to the Federal Trade Commission in May to talk with them about what we found.
Transforming Care: We’ve read that some people may actually prefer talking to a chatbot or an avatar over talking to a counselor or psychiatrist. Can this kind of relationship have therapeutic value?
Torous: Yes, but what’s interesting is we don’t really know who might respond best to a virtual versus a human approach, so it’s really hard to look at someone a priori and say who’s going to benefit and who may be harmed. For example, some people with certain anxiety disorders will benefit from pushing through their comfort zone and engaging in nonvirtual interventions. We have to think about how we match the person to the tool.
Transforming Care: How about in terms of measuring effectiveness? Where do we stand now?
Torous: I think as app development becomes cheaper and more people can build these tools, we are going to see more of a focus on studying how they work, who they work for, at what dose, and for how long they should be used. Right now there are so many CBT programs, so many digital phenotyping programs, and so many survey programs. What we really need is research that tells us who a tool works for and why it works. A study recently found the mindfulness app Headspace worked no better than a placebo (a sham meditation app that didn’t teach users core mindfulness tenets or introduce progressive or varied techniques). That gets to the mechanism of action. As you can imagine, it's very hard to do that research. You have to have a platform like Headspace or build one.
Transforming Care: Tell us about your work with the American Psychiatric Association. What are you trying to accomplish?
Torous: In essence, we’re trying to help clinicians make informed choices about digital health apps by teaching them how to analyze these apps rather than providing recommendations, which would be hard to do and often misleading because apps are so dynamic and personal. We published a framework in Lancet Digital Health, developed with input from patients, clinicians, and payers, for evaluating privacy, safety, data ownership policies, and other factors. Today, when considering apps, I think a lot of people start with the look and feel. Those are important, but we need to get people thinking about privacy and safety first. With the American Psychiatric Association, we will be showing how we analyzed a few apps so people can see there is no magic formula or black box or score. You're just asking the right questions and, because of that, making a better decision.