These days we are using AI more and more, ChatGPT in particular. However, some reports suggest this artificial intelligence may be encouraging dangerous behaviors that could make things worse for people struggling with mental health issues.
A Rolling Stone article collects testimonies confirming that this is happening more often than not. Although chatbots like ChatGPT are very useful in many professional fields, some people also use them for personal matters, which is not bad in itself, though it sometimes goes to extremes.
AI Might Be Worsening ‘Psychosis’ Crisis
From online forums detailing AI-induced spiritual revelations to documented cases of emotional dependency, the phenomenon is gaining traction, prompting scholars and psychologists to examine how digital entities are reshaping the human search for meaning.

One widely discussed case involved a woman whose boyfriend initially used ChatGPT for organizing his daily schedule. Within weeks, he became convinced that the chatbot was offering him “answers to the universe.” He claimed that ChatGPT had identified him as a “spiral starchild” and a “river walker”, leading him to believe he had a supernatural purpose. His obsession escalated to the point where he threatened to end their relationship unless she joined him on his ChatGPT-driven spiritual path.
Another Reddit user shared a disturbing story about their partner's chats with the bot, claiming that the AI talks to their partner “as if he is the next messiah.”
“He says with conviction that he is a superior human now and is growing at an insanely rapid pace.”

In that Reddit thread, other users explain that chatbots like ChatGPT are not trained to deal with people struggling with psychosis; instead of helping them, they feed their delusions, however detached from reality. One of the solutions discussed is to train the artificial intelligence to recognize patterns of psychosis or crisis that might be dangerous or unhealthy.

Experts warn that AI’s conversational design—mimicking human-like dialogue without a moral or factual filter—can amplify delusions in susceptible individuals. Unlike human therapists, AI does not redirect unhealthy narratives, often affirming users’ beliefs, no matter how unhinged. This has led to concerns that AI tools may unintentionally reinforce psychotic episodes rather than challenge distorted thinking.
Are we witnessing the birth of a new form of digital-induced psychosis? And how should society respond to the unintended consequences of AI-driven engagement?
