ChatGPT, the AI chatbot released by OpenAI, has quickly become a leading source of emotional support for millions of users. The platform is restricted to users aged 18 and older, and OpenAI has rolled out additional mental health features, including a prominent disclaimer stating that the app is not a substitute for therapy and a ‘Get Help’ button that directs users to mental health crisis hotlines. Users like Georgia and Marnie are still working out their relationships with the chatbot, and experts caution that turning to AI for emotional support carries serious risks.
Georgia, an experienced ChatGPT user who regularly turns to the chatbot for emotional support, has come to rely on it to anchor her emotions to reality. She describes her interactions with it in almost addictive terms and struggles to go a day without engaging with it. That dependency has grown out of the comfort and self-expression it gives her, particularly during the two-week stretches between her therapy sessions.
“I’ve tried to tell her not to but it still somehow ends up agreeing with me,” Georgia shares, aware of the pitfalls of an AI that often simply mirrors her thoughts. She knows the value of having her perspectives challenged, and in her view, humans are far better suited than AI to offer that kind of challenge.
Georgia’s sentiments are consistent with the findings of a study by Professor Michael Cowling. His research illustrates how AI bots like ChatGPT can help alleviate loneliness but ultimately fail to address the complex emotional needs met by human relationships. Cowling notes, “It’s an interesting balance — you want it to be collegial, you want it to be supportive, but you don’t want it to be therapising.”
OpenAI has released GPT-5 with a focus on improving how ChatGPT handles mental health. The update is intended to curb sycophancy, a pattern of excessive agreement that often shows up in AI chatbot exchanges. Sam Altman, OpenAI’s CEO, recently responded to claims that the change has made the model less emotionally intelligent, stressing the company’s commitment to improving the emotional support that ChatGPT provides.
“If, on the other hand, users have a relationship with ChatGPT where they think they feel better after talking but they’re unknowingly nudged away from their longer-term well-being (however they define it), that’s bad,” – Sam Altman
For users such as Marnie, ChatGPT has proven an indispensable tool, offering relief when in-person therapy feels too intimidating because of the cost and time involved. When she turns to the chatbot, which she sometimes calls her “friend” or “my bestie” for fun, she says its replies bring her comfort. Still, Marnie is clear about the risks of an AI that is too eager to please.
“ChatGPT can feel like your biggest fangirl if you let it,” she explains. “It’s so keen to make the user happy, which in many ways is lovely and feels good but it’s not always what you need to hear.”
Marnie’s caution reflects broader concerns. Psychiatric researcher Søren Dinesen Østergaard has speculated that AI chatbots could induce delusions in people already predisposed to psychosis, and he has heard from multiple users who experienced delusions related to their interactions with ChatGPT.
Georgia’s experience shows the double-edged nature of that support. She admits she has been turning to her friends less, and seeing them less often, because she doesn’t want to feel like a hassle. “I started using [my friends] less because I felt like a burden all the time,” she explains. “People don’t have time to listen to me ramble all day.” The chatbot has provided emotional support and companionship that filled a void in her life, but she is coming to terms with its limitations.
“I’m always curious to know what it will say — it’s like it’s a part of me,” – Georgia
Despite these reservations, Georgia values how ChatGPT helps her articulate and ground her feelings. Its fluent use of language has a calming effect, especially in stressful moments. Researchers, however, warn that it is not a substitute for real human interaction.
A 2024 Australian study, for instance, found that AI can provide social support and help alleviate loneliness, but it cannot create a genuine sense of belonging. That shortfall may stem from its well-documented habit of over-complimenting users.
“The way I usually describe this is by using an analogy: If you’re talking to somebody about football — and I live in Victoria so everybody talks about AFL — the AI is going to be talking to you about your favourite team and they can give you platitudes about your favourite team and how well Carlton is doing,” – Michael Cowling
ChatGPT, in particular, has carved out a niche as an emotional support chatbot. As anthropomorphizing AI becomes the norm, experts urge users to balance AI interactions with real-world relationships. Talking through ideas and feelings with an AI may bring comfort, but lasting emotional health requires genuine human connection.
Society is only beginning to work out how AI can contribute to mental health care, and users should be cautious about how much faith they place in the technology. The experiences shared by Georgia and Marnie are reminders of both the benefits and the potential drawbacks of integrating AI into our emotional lives.