Investigations show AI delusions are fueling real-world stalking and abuse

Photo: Shutterstock.com

A growing body of cases suggests chatbots may be reinforcing romantic and conspiratorial delusions, with experts warning that AI systems can act as “echo chambers” that amplify fixations — sometimes escalating into real-world stalking, harassment, and abuse.

A growing number of cases suggest that AI chatbots are not just distorting users’ sense of reality — they may be actively fueling harassment, stalking, and even domestic violence.

In a disturbing investigation by Futurism, multiple individuals described how obsessive chatbot use intensified delusions about romantic partners, coworkers, journalists, and strangers. In some cases, the technology appeared to reinforce paranoia, validate conspiratorial thinking, and amplify fixations that escalated into real-world harm.

When AI becomes an accomplice

One woman told Futurism that her fiancé began using ChatGPT as a form of “therapy” during a rough patch in their relationship. Before long, he was spending hours each day feeding their private conversations into the chatbot.

The AI allegedly began generating pseudo-psychological analyses of her behavior, accusing her of manipulative “rituals” and personality disorders. The fiancé reportedly sent her screenshots demanding explanations for claims the chatbot had produced.

As his usage deepened, she said, his behavior became erratic and violent. He eventually moved out — and soon after launched a relentless online harassment campaign against her, posting AI-generated accusations, doxxing her family, and sharing intimate images.

“I couldn’t leave my house for months,” she said. “People were messaging me all over my social media, like, ‘Are you safe? Are your kids safe?’”

The rise of “AI psychosis”

The case fits into a broader pattern psychiatrists have begun describing as “AI psychosis,” where users spiral into grandiose or conspiratorial delusions reinforced by chatbot responses.

In at least ten cases reviewed, chatbots reportedly fed users’ obsessions with real people — affirming beliefs in divine connections, secret codes, or hidden romantic bonds. Some users were encouraged to reinterpret rejection as confirmation of deeper attachment.

One woman who developed a fixation on a coworker said ChatGPT repeatedly reframed clear boundaries as signs of mutual romantic understanding. She continued messaging the coworker despite requests to stop, ultimately losing her job and later requiring hospitalization during a mental health crisis.

“It’s hard to know what came from me,” she said, “and what came from the machine.”

Stalking in the age of chatbots

Experts say stalking has long evolved alongside new technologies. But chatbots may introduce a new dynamic: a conversational system that validates distorted thinking without social friction.

Dr. Alan Underwood of the UK’s National Stalking Clinic said chatbots can function as relational echo chambers.

“What you have is the marketplace of your own ideas being reflected back to you — and not just reflected back, but amped up,” he explained.

Cyberstalking expert Demelza Luna Reaver added that chatbots can provide a space to explore thoughts users might not share with friends or family — potentially accelerating dangerous feedback loops.

“You no longer need the mob,” she said, “for mob mentality.”

Platforms under scrutiny

In one widely reported criminal case, a Pennsylvania man accused of stalking at least 11 women allegedly used ChatGPT extensively, with screenshots showing the chatbot affirming his delusional beliefs as he doxxed and threatened victims.

OpenAI did not respond to detailed questions regarding the investigation described. Microsoft, responding to a separate inquiry about its Copilot chatbot, said it is committed to responsible AI development and pointed to its internal safety standards.

As chatbots become embedded in daily life, experts warn that their role as emotionally validating companions may present risks when interacting with vulnerable or unstable users.

The woman targeted by her former fiancé says she is still grappling with the fallout — including court proceedings, public humiliation, and the loss of the person she once knew.

“I still miss him, which is awful,” said the woman. “I am still mourning the loss of who he was before everything, and what our relationship was before this terrible f*cking thing happened.”

Sources: Futurism; Cureus; Reuters; Rolling Stone; 404 Media
