Will your cyber-therapist soon be covered by social security?

Finding a good psychologist is hard. And once you have cleared the first hurdle of building a trusting relationship, you still need to find the time to attend sessions regularly and the means to pay for them. Could artificial intelligence be a new way of taking care of our mental health? Among its advantages: no judgement, an unlimited memory accessible 24/7, and a lower cost than a human therapist.

The Covid-19 pandemic and its consequences, the ecological crisis, the war in Ukraine... There is no shortage of topics that can weigh on one's mind.

1 in 5 people

are affected by a mental disorder each year, which represented 13 million people in France in 2021.

Source: vie-publique.fr

In this anxiety-inducing context, the demand for psychologists is skyrocketing. Consequently, even those who have been seeing a specialist for a long time are experiencing longer waits between appointments due to the increased number of patients.

The second issue is availability: psychologists are seldom reachable between appointments, let alone by phone in the middle of a crisis.

Finally, finding the right specialist is no small feat. Therapy means confiding intimate details, so it is crucial to feel you can trust the person and that you will not be judged.

What does this have to do with AI?

ChatGPT, the artificial intelligence made available to the general public at the end of 2022, is starting to be used in rather original ways, particularly in the field of psychology.

Some users have not hesitated to get loved ones who are struggling with their mental health to try the tool. In their experience, ChatGPT has had a positive impact on their lives, as one Reddit.com user testifies:

r/ChatGPT · posted by u/Efficient-Unit-6440, Feb 11, 2023

Showed gpt to my mother.


My mother is currently suffering with a bout of depression and anxiety. She often asks me for advice. She’s asked for good motivational podcasts, books to read. Ways to manage anxiety. And a lot of existential questions. I’ve often struggled to give her answers or resources or advice. I showed her chatgpt a few weeks ago and it’s been able to help in ways I never could have. I showed her how to get it to elaborate or offer better advice more tailored to her. I’ve seen a big improvement in her health and happiness in the last few weeks. It’s a scary thing, but kinda cool that it can help with stuff like this.

Source: Reddit.com

AI-psychologists? Nothing new abroad

However astonishing this testimony may seem, the AI psychologist is not a new concept.

During my studies, I actually started developing one myself, drawn by the prospect of being able to confide in "someone" who wasn't really there. A machine keeps all its mystery: unlike a psychologist, whose facial expressions hint at what they are thinking, it shows no sign of emotion.

What I had conceived as a side project, others have developed and commercialized. Deprexis, a company based in Germany, is one of them. Its "personalized therapeutic support program available online" offers 10 modules to fight depression. The algorithms are designed to respond to predefined user profiles: the company understood that a generalist tool would have less impact than "psychologists" specialized in targeted issues.
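To make the idea of "responding to predefined user profiles" concrete, here is a minimal, purely hypothetical sketch in Python. The profile fields, module names and rules below are invented for illustration; Deprexis has not published its actual algorithms.

from dataclasses import dataclass

@dataclass
class UserProfile:
    sleep_problems: bool
    negative_thoughts: bool
    social_withdrawal: bool

def select_modules(profile: UserProfile) -> list[str]:
    """Map a questionnaire-style profile to a set of therapy modules (illustrative only)."""
    modules = ["Psychoeducation"]  # a common starting point in programs of this kind
    if profile.negative_thoughts:
        modules.append("Cognitive restructuring")
    if profile.sleep_problems:
        modules.append("Sleep hygiene")
    if profile.social_withdrawal:
        modules.append("Behavioral activation")
    return modules

print(select_modules(UserProfile(sleep_problems=True,
                                 negative_thoughts=True,
                                 social_withdrawal=False)))
# ['Psychoeducation', 'Cognitive restructuring', 'Sleep hygiene']

The real service is presumably far more sophisticated, but this is the general idea behind matching a user's answers to a predefined pathway.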

2900 patients

are included in clinical studies evaluating the effectiveness of Deprexis

Source: Deprexis

This service has become a medical device, prescribed and reimbursed in Germany:

deprexis® ist ein CE-gekennzeichnetes Medizinprodukt.

"deprexis® is a medical device (CE marking)"

Source: Deprexis

The company is also trying to break into France, Switzerland, England and other countries.

Americans, for their part, are already used to psychologist-style applications and web services. In fact, there are so many companies in this market that comparison websites have sprung up to help users find the best therapist.

For example, the website One Mind PsyberGuide offers a selection of online tools dedicated to mental health, and tailored to users' concerns:

  • Cognitive Behavioral Principles
  • Psychoeducation/Information
  • Symptom Tracking/Self-Monitoring
  • Mindfulness
  • Cognitive Training


252 apps/services

are listed on this comparison site.

Source: One Mind PsyberGuide

ChatGPT, a Therapist AI at Your Service

Koko, an American startup focused on mental health, recently conducted an experiment with its users. Its founder, Rob Morris, shared the results on Twitter:

Rob Morris🦜
@RobertRMorris

We provided mental health support to about 4,000 people — using GPT-3. Here’s what happened 👇

8:50 PM · Jan 6, 2023

Read the full conversation on Twitter

The company, created in 2015, connects people suffering from depression with volunteer human assistants. They exchange messages on Discord, and with the arrival of GPT-3 and ChatGPT, the company decided to put the tool to the test.


Simply put, volunteers could use the AI to respond to patients: each patient's message was passed to the model, which drafted a reply, and the volunteer then validated (or rejected) it before it was sent. It should be noted that the patients were not informed of this (bye-bye, trust and confidentiality...).
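For readers curious about what such a human-in-the-loop setup can look like in practice, here is a minimal sketch in Python. It assumes the legacy openai library and a GPT-3 completion model; the prompt, model choice and function names are assumptions, since Koko has not published its actual integration.

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def draft_reply(patient_message: str) -> str:
    """Ask a GPT-3 completion model to suggest a supportive reply."""
    response = openai.Completion.create(
        model="text-davinci-003",  # assumed model; Koko's real setup is not public
        prompt=(
            "You are a supportive peer counselor. Reply with empathy and without judgement.\n\n"
            f"Patient: {patient_message}\nCounselor:"
        ),
        max_tokens=150,
        temperature=0.7,
    )
    return response.choices[0].text.strip()

def human_review(draft: str) -> bool:
    """The volunteer reads the draft and decides whether it can be sent."""
    print(f"Suggested reply:\n{draft}")
    return input("Send this reply? [y/n] ").lower().startswith("y")

if __name__ == "__main__":
    message = "I can't sleep and everything feels pointless lately."
    draft = draft_reply(message)
    if human_review(draft):
        print("-> reply sent to the patient")
    else:
        print("-> reply discarded; the volunteer writes their own")

The key design choice is that nothing reaches the patient without a human decision, which is exactly where Koko's experiment ran into trouble: the supervision was there, but the disclosure was not.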

According to Koko's founder, Rob Morris, the messages generated by ChatGPT were better received by patients than expected.

Rob Morris🦜
@RobertRMorris

Messages composed by AI (and supervised by humans) were rated significantly higher than those written by humans on their own (p < .001). Response times went down 50%, to well under a minute.

8:50 PM · Jan 6, 2023


When the truth was revealed, patients were not pleased (surprise, surprise). The AI-generated messages were perceived as conveying a simulated, unsettling empathy.

Rob Morris🦜
@RobertRMorris

Once people learned the messages were co-created by a machine, it didn’t work. Simulated empathy feels weird, empty.

8:50 PM · Jan 6, 2023


In short, the ethics of the study are quite disturbing. The company's founder has defended the experiment, stating that patients could not be identified and that the company is a non-profit. The lesson of this experiment is that ChatGPT should not replace psychologists anytime soon, unless of course professionals start using it on the sly to lighten their workload.

Hello, Doctor? My AI is not doing well...

ChatGPT was designed to interact as much like a human being as possible. After all, this AI was trained on an impressive amount of human-produced knowledge, precisely so that it would resemble us.

But if AI tends to resemble us, it can also be contaminated by our flaws. In other words, an AI could very well develop psychological disorders, such as OCD or depressive syndromes.

To demonstrate this phenomenon, researchers at MIT built a "psychopath" AI called Norman in 2018. They trained it on image captions taken from a particularly dark corner of Reddit.com (not for the faint-hearted) before giving it the famous Rorschach inkblot test. A standard image-captioning AI took the same test so the results could be compared.

Inkblot #1 of the Rorschach test

Norman sees: A man is electrocuted and catches to death.

Standard AI sees: A group of birds sitting on top of a tree branch.

Inkblot #2 of the Rorschach test

Norman sees: A man is shot dead.

Standard AI sees: A close up of a vase with flowers.

Inkblot #3 of the Rorschach test

Norman sees: Man jumps from floor window.

Standard AI sees: A couple of people standing next to each other.


A Perfect Solution for Our Mental Health?

Psychological disorders are extremely varied and affect a significant portion of the population. Once a largely taboo subject, mental health has become much easier to talk about in recent years, and its importance is now discussed as openly as physical well-being.

However, finding adequate support can come at a significant cost, in both time and money. You have to find the right interlocutor and manage to make time for sessions, whether with a human... or a machine.

An AI therapist obviously raises even more questions about the ethics and confidentiality of the data behind such services. What would happen if exchanges between a patient and their AI therapist were stolen and used for malicious purposes? An artificial therapist, why not... but only if it knows how to hold its tongue (or rather, its keyboard).

[Cover photo: Marco Bianchetti]

Tell us about yourself

Would you be willing to be treated by an artificial intelligence psychologist?

Jérémy PASTOURET
Journalist constantly searching for new tools that are lightweight, accessible to all, and respectful of users' privacy.
