The spectacular growth of mental health apps has created a risky industry



WHEN CAROLINE ESCUDERO was severely depressed, making it to a therapist proved difficult. So she joined BetterHelp, a popular therapy app. She was paying $65 a week but spent most of her time waiting for a response from her assigned counsellor. She got two replies in a month. “It was like texting an acquaintance who has no idea how to deal with mental illness,” she says. BetterHelp says its service does not claim to be available around the clock, that all of its therapists have advanced degrees and “thousands of hours of hands-on clinical work,” and that users can easily switch therapists if scheduling proves difficult.


Helping people cope with mental health problems has rarely been more urgent. The incidence of depression and anxiety soared during the pandemic, rising by more than 25% globally in 2020, according to the Lancet, a medical journal. That, combined with more people using online services in general, has led to a boom in mental health apps. The American Psychological Association estimates that 10,000 to 20,000 are available for download. But evidence is mounting that risks to user privacy are being overlooked. Nor is anyone checking whether the apps actually work.

Mental health tech companies raised nearly $2 billion in equity funding in 2020, according to CB Insights, a data firm. Their products tackle problems ranging from general stress to severe bipolar disorder. Telehealth apps such as BetterHelp and Talkspace connect users to licensed therapists. Subscription meditation apps such as Headspace are also common. In October Headspace bought Ginger, a therapy app, for $3 billion. With big companies now prioritizing employee mental health, some apps are working with employers to support their entire workforces. One such app, Lyra, serves 2.2 million employed users worldwide and is valued at $4.6 billion.

Beneath the surface, however, trauma lurks in some corners of the industry. In October 2020 hackers who had breached Vastaamo, a popular Finnish startup, began blackmailing some of its users. Vastaamo had asked therapists to store patient notes online, but reportedly neither anonymized nor encrypted them. Threatening to publish details of extramarital affairs and, in some cases, thoughts about pedophilia on the dark web, the hackers reportedly demanded bitcoin ransoms from some 30,000 patients. Vastaamo filed for bankruptcy, but the episode has left many Finns reluctant to disclose personal information to doctors, says Joni Siikavirta, a lawyer representing the company’s patients.

Other cases may follow. There is no universal standard for storing “emotional data”. John Torous of Harvard Medical School, who has reviewed 650 mental health apps, describes their privacy policies as appalling. Some share information with advertisers. “When I first joined BetterHelp, I started seeing targeted ads using words that I had used on the app to describe my personal experiences,” reports one user. BetterHelp says it shares with marketing partners only device identifiers associated with “generic event names,” only for measurement and optimization purposes, and only if users consent. No private information, such as conversations with therapists, is shared, it says.

As for efficacy, the apps’ methods are notoriously hard to assess. Woebot, for example, is a chatbot that uses artificial intelligence to replicate the experience of cognitive behavioral therapy. The product is marketed as clinically validated, based in part on a scientific study which concluded that humans can form meaningful bonds with bots. But that study was written by people with financial ties to Woebot. Of its ten peer-reviewed studies to date, Woebot says, eight involved collaborations with a lead investigator who has no financial connection to the company. All financially linked co-authors are disclosed, it says.

Mental health apps were designed to be used alongside clinical care, not instead of it. With that in mind, the European Commission is taking stock. It is preparing to promote a new standard that will apply to all health apps. A letter-based scale will rank safety, usability and data security. Liz Ashall-Payne, founder of ORCHA, a British startup that has reviewed thousands of apps, including for the National Health Service, says 68% of them did not meet her company’s quality standards. Time to get back on the couch?


This article appeared in the Business section of the print edition under the title “Psyber boom”
