The Illusion of Digital Empathy: Why ChatGPT Can Never Be Your Psychiatrist

“Our goal is not to keep people on the platform, but to help them use it in the best way.” This is how OpenAI announced the latest update to ChatGPT. The goal is clear: to protect users' mental well-being. How? By encouraging them to take breaks and by responding to personal questions more empathetically. But let's take it one step at a time.
“You've been chatting for a while: is it a good time for a break?”
The new update, which has just been released, involved more than 90 doctors from over 30 countries, as well as an advisory group of experts in mental health, youth development, and human-computer interaction, in an effort to protect the most vulnerable users. The first step was the introduction of reminders that encourage users to take a break during particularly long chat sessions. When that happens, a window opens, a feature already tested successfully on Instagram and TikTok, with the phrase "You've been chatting for a while: is it a good time for a break?" and two options: continue or end the session.
No more dry or unambiguous answers
The other intervention concerns how ChatGPT responds to personal questions, such as those relating to relationships, work, or otherwise complex emotional situations. In these cases, the chatbot will no longer provide dry, unambiguous answers but will invite the user to reflect more deeply, suggesting they consider different perspectives. The goal, the company explains, is never to replace the user's autonomy, but to offer respectful and responsible guidance.
20% of Gen Z uses ChatGPT as a psychologist
Mental well-being, however, has been a major concern around ChatGPT for several months. Word of mouth has spread on social media: "Use ChatGPT as a psychologist. I tell it about all my problems," reads one post. And while the platforms don't release user data, according to an estimate by Mattia Della Rocca, professor of Psychology of Digital Environments at the University of Tor Vergata, at least 20% of Gen Z may have used AI at least once as a substitute for therapy.

And there's no doubt that ChatGPT is easy to access, just a few clicks away. But what are the real risks of using these technologies improperly? Among the greatest dangers are so-called AI "hallucinations": nonsensical or completely inappropriate responses. Such responses can be particularly dangerous in contexts involving depression or self-harm, where they risk prompting the user to take harmful actions.
Learning impairment? The results that confuse AI users
But that's not all. According to an MIT study, daily use of ChatGPT and other artificial intelligence tools has serious repercussions on the ability to learn, think, and remember. The research, the first of its kind, used electrodes to measure the brain activity of three groups of students: the first worked with ChatGPT, the second with internet access (but without artificial intelligence tools), and the third with other, more traditional tools. The task was to write three short texts on predefined topics over three consecutive sessions, spread across a four-month period.
The results left no room for doubt: students who used large language models (LLMs) like ChatGPT to write their texts showed poorer memory, reduced brain activity, and weaker engagement than those who used the other two methods. "Using LLMs had a measurable impact on participants, and while the benefits were initially evident, as we demonstrated over the course of four months, participants in the ChatGPT group performed worse than their counterparts in the 'brain-only' group at every level: neural, linguistic, and in scoring," the MIT researchers explained.
Listening goes beyond simply receiving words
In short, although these updates aim to make artificial intelligence more human, we've seen that it can't replace a psychologist. It can certainly help with thinking, but not with healing. It can offer artificial listening, not a human relationship. It's a useful tool, but it's not a therapist. And in an age where communication is instant, true listening remains a profound human quality, one that goes far beyond the simple reception of words.
Luce