Will Your Future Smart Device “Red Flag” You?
Digital apps may change the future of mental health assessment and treatment.
Posted June 17, 2022 | Reviewed by Tyler Woods
Key points
- In the future, mental health apps may change the way we view mental health assessment and treatment.
- Mental health apps using body-worn sensors may someday provide useful mental health diagnoses.
- New rules are needed for future digital mental health records.
- Will smart devices someday provide mental health "red flags"?
There is a recent push for new mental health strategies to prevent violence and other social ills. One approach being explored is new technology such as “Mental Health Apps” (MHAs), which offer new opportunities to reach patients and address risks. But what rules and strategies must emerge along with the advent of MHA technology?
Mental health apps have been available for some time, as mentioned in a prior article. First-generation MHAs mostly provided reminders and positive messages, which could be helpful for mindfulness, sleep hygiene, life and illness management, and skills training. Unlike human therapists, digital mental health apps are available 24/7. Besides providing journaling prompts and inspirational messages, mental health apps also collect self-report data. User responses are kept in a database and analyzed to provide feedback.
New-generation MHAs integrate biosensors and devices such as smartwatches, phones, or sensor pads to monitor fluctuations in the user’s daily signals. The latest devices record everything from physical activity and sleep to skin resistance, temperature, blood oxygen levels, and ECG readings, and some add fall detection and emergency medical alerts. These body-worn devices monitor readings and activity automatically, lessening the burden on patients of entering data by hand. The newest MHAs crunch all that bio-psych data using algorithms to identify trends and employ AI to provide feedback. In the near future, they will likely also offer preliminary diagnoses and even treatments. For example, your future MHA might detect an unusually high stress reading and recommend a wellness checklist or relaxation module. You engage in a conversation with your AI therapist, and your device lets you know when your readings return to a healthier level.
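To make the idea concrete, here is a minimal sketch of how an app might compare the newest wearable reading against a user's own recent baseline and decide whether to prompt a check-in. The field names, window size, and z-score threshold are illustrative assumptions for this post, not any app's actual algorithm or a clinically validated rule.

```python
# Hypothetical sketch: turning raw wearable readings into a "check in" prompt.
# The threshold, window, and simple z-score rule are illustrative assumptions.
from statistics import mean, stdev

def stress_flag(readings, window=20, z_threshold=2.0):
    """Return True if the newest reading is unusually high
    relative to the user's own recent baseline."""
    if len(readings) < window + 1:
        return False  # not enough history to define a personal baseline
    baseline = readings[-(window + 1):-1]       # the `window` readings before the newest
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False                            # flat baseline; nothing to compare against
    z = (readings[-1] - mu) / sigma             # how far the newest reading sits from baseline
    return z > z_threshold

# Example: a stable heart-rate baseline followed by a sudden spike triggers the flag.
heart_rates = [72, 74, 71, 73, 75, 70, 72, 74, 73, 71,
               72, 75, 74, 73, 72, 71, 74, 73, 72, 74, 98]
if stress_flag(heart_rates):
    print("Elevated reading detected: suggest a relaxation module or wellness checklist.")
```

Even this toy version shows why validation matters: a threshold set too low floods users with false alarms, while one set too high misses the moments the app exists to catch.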
But questions remain: Where is mental health monitoring data headed in the future? What guardrails are needed for data collected by MHAs and digital devices?
Several steps can be considered:
- Psychologists must validate the accuracy of MHAs. Consider the consequences of misdiagnoses, false positives, or false negatives. Beta testing an app is not as thorough as conducting clinical trials.1 Clinicians can partner with engineers and software developers to make MHAs more accurate, safe, and effective. The future of digital therapeutics requires clinical trials of efficacy and consumer education about the uses and abuses of new technologies. For example, researchers have conducted trials of internet-based cognitive behavioral therapy for depression and anxiety.2 Similar well-controlled research on MHAs and body-worn sensor data is needed to build accuracy and acceptance.
- Rules are needed for how MHA data will be shared. Will user data feed into digital mental health records? Will that data give patients better risk assessment and access to treatment? On the other hand, how or when will mental health data be used to "red-flag" those considered a risk to themselves or others? What will be the procedure for getting a second opinion or questioning an AI-based diagnosis? How could users remove a red flag that an MHA algorithm has assigned? Strict user permissions and privacy protections are crucial for the new digital mental health records frontier, especially if we want patients to adopt and use the new technology.3
- MHAs will eventually evolve toward providing treatments. In the future, perhaps a high-risk score will trigger MHA recommendations to seek therapy or guide potential patients to mental health services. Soon, virtual mental health assistants might serve as confidential sounding boards, prompting users to divulge their problems, stories, and feelings. Perhaps some folks will prefer “therapy” with an anonymous, nonjudgmental robot? This will be the brave new world of computer-mediated assessment and therapy. Innovation and testing are still needed, but these technologies hold great potential to guide services that address mental health concerns.4
As MHAs gain acceptance, developers and clinicians will have to establish rules to protect user privacy, as well as the circumstances in which MHA data might be ethically and legally used to enhance public safety. The key is to balance patients' privacy rights and HIPAA compliance with the desire to identify and intervene during mental health crises.
Password: “Take a Balanced Approach.”
References
(1) Seppälä, J., et al. (2019). Mobile Phone and Wearable Sensor-Based mHealth Approaches for Psychiatric Disorders and Symptoms: Systematic Review. JMIR Mental Health, 6(2), e9819. https://doi.org/10.2196/mental.9819. PMCID: PMC6401668.
(2) Richards, D., Enrique, A., Eilert, N., et al. (2020). A pragmatic randomized waitlist-controlled effectiveness and cost-effectiveness trial of digital interventions for depression and anxiety. npj Digital Medicine, 3, 85.
(3) Bessenyei, K., Suruliraj, B., Bagnell, A., McGrath, P., et al. (2021). Comfortability with the passive collection of smartphone data for monitoring of mental health: An online survey. Computers in Human Behavior Reports, 4, 100134. https://doi.org/10.1016/j.chbr.2021.100134
(4) National Institute of Mental Health (NIMH). Technology and the Future of Mental Health Treatment. Available online: www.nimh.nih.gov/health/topics/technology-and-the-future-of-mental-health-treatment