A study of mental health apps found that many of the most popular services fail to protect the privacy and security of their users. Following a report from last year’s “Privacy Not Included” guide, Mozilla researchers found that apps designed for sensitive issues like therapy and mental illness still collect large amounts of personal data under questionable or misleading privacy policies.
The team re-evaluated 27 of the mental health, meditation, and prayer apps from last year’s study, including Calm, Youper, and Headspace, along with five new apps requested by the public. Of those 32 apps, 22 were slapped with a “privacy not included” warning label, which Mozilla assigns to products that raise the most concerns about privacy and personal data. That’s a slight improvement on the 23 that earned the label last year, though Mozilla said about 17 of the 27 apps it revisited scored as poorly, if not worse, on privacy and security this time around.
Replika: My AI Friend, a “virtual friendship” chatbot, was one of the new apps analyzed in this year’s study and drew the harshest criticism. Mozilla researchers called it “arguably the worst app we’ve ever reviewed,” citing widespread privacy concerns and noting that it failed to meet the foundation’s minimum security standards. Regulators in Italy effectively banned the chatbot earlier this year over similar concerns, claiming the app violated European data privacy regulations and failed to protect children.
BetterHelp was also called out for inappropriately sharing its customers’ sensitive data with advertisers such as Facebook and Snapchat after promising to keep that information private. In March, the online counseling company agreed to pay the Federal Trade Commission $7.8 million to settle charges over the practice. Other mental health apps found to have poor privacy and security practices include Pride Counseling (owned by BetterHelp), Talkspace, Headspace, and Shine. Mozilla also noted that Better Stop Suicide, Liberate, and RAINN are no longer supported and are therefore unlikely to receive critical security updates that protect users.
Meanwhile, some of the apps on last year’s list have improved. Youper was highlighted as the most improved of the bunch, having revised its data collection practices and strengthened its password requirements. Moodfit, Calm, Modern Health, and Woebot also made notable improvements by clarifying their privacy policies, while researchers praised Wysa and PTSD Coach for being “head and shoulders above the other apps in terms of privacy and security.”
Mozilla says the results of this latest survey don’t necessarily mean you should stop using an app that scored poorly. The team included custom tips for each app reviewed in the report, offering advice on how to protect your privacy while using it.
Many of the issues outlined in Mozilla’s report tap into broader concerns about the privacy of mental health apps. Increased demand for these services during the Covid pandemic prompted lawmakers, including Senator Elizabeth Warren, to investigate the relationship between therapy apps and online advertisers last year, over concerns that the apps could be unfairly exploiting customers’ sensitive data. Mozilla claims the market for mental health apps alone has grown by about $1 billion since 2022.