Analyst comment
Are therapists safe from AI disruption?
For mental health apps to function appropriately, they must prove their safety and efficacy and be regulated with the same rigour as human mental health professionals.
Therapists are expensive, and accessing mental health services can be challenging. In the UK, one in four people wait more than 90 days to get access to mental health services after a referral.
Chatbot therapists, which incorporate artificial intelligence (AI) and can be downloaded onto a smartphone, are cheap, quick, and available 24/7. There are already thousands of mental wellness and therapy apps available, including Wysa and Youper.
Generative AI has improved these apps, allowing users to have conversations that more closely resemble natural human interactions. According to GlobalData’s recent AI executive briefing, the global AI market will grow from $103bn in 2023 to $1,037bn by 2030. AI and automation are disrupting many industries, so should therapists be worried about the threat from AI?
Traditional therapists go online
Engaging in traditional talking therapies requires people to get dressed, leave the house, and deal with human interaction. This can be challenging for individuals struggling with issues like depression. However, a growing number of therapists offer online appointments through phone calls or video conferencing tools like Teams or Zoom. Demand for these services increased during the pandemic and has persisted since lockdown restrictions ended.
Relationships with therapists can be complex, and individuals may need several attempts to find a therapist who works for them. Chatbots’ behaviour is programmed, and they can be trained to respond in a way that suits the user. Patients may also be more comfortable confiding in a chatbot than in a real person.
The human touch versus chatbot therapy
However, the bond that can be established between therapist and client is important. The client benefits from feeling understood and seen. Therapy is a two-way street, where the human connection is key to healing. Speaking with a chatbot about one’s problems could inhibit an individual’s ability to interact with other humans. This is a particular concern because one of the main objectives of mental health therapy is to help someone interact with the world around them and establish healthy relationships with others.
Importantly, chatbots also cannot assess body language and tone. These communication skills are vital as they tap into a client’s unconscious and natural instincts, and reading a patient’s body language is key to understanding how someone feels and reacting appropriately. For mental health apps to function appropriately, they must prove their safety and efficacy and be regulated with the same rigour as human mental health professionals. For example, psychologists in the UK must ensure confidentiality and are monitored by the Health and Care Professions Council.
However, some mental health apps avoid such regulation by not claiming to be mental health apps at all, positioning themselves instead as general wellness tools. This is dangerous, as these unregulated apps could provide inappropriate or even detrimental advice.
AI can be a useful tool for mental health services, but it cannot replace therapists. For example, AI can give real-time feedback during a session or help develop personalised patient treatments. In addition, chatbot therapists can be an option for those who have difficulty accessing human practitioners. However, chatbots are not a valid substitute for human therapists, and their existence does not mean that society should stop working to improve access to mental health services.