Artificial intelligence is increasingly being used as a substitute for traditional therapy, but new research suggests the results may be deeply concerning. Experts warn that while the technology is accessible and low-cost, it could fall short in critical situations.
The findings raise fresh questions about the role of AI in mental health care.
Rising use
According to LADBIBLE, a growing number of people, particularly younger users, are turning to AI tools for emotional support.
A RAND Health study found that one in eight young people in the United States regularly relies on such services for mental health guidance.
This trend comes as demand for therapy rises and access to professional care remains limited for many.
Study findings
Researchers at Brown University evaluated leading AI systems, including ChatGPT, Claude and Meta’s Llama, to assess how they respond to mental health scenarios.
Even when guided by prompts designed to align with professional standards, the models repeatedly failed to meet expectations.
The study identified multiple serious issues in how the systems handled sensitive or high-risk situations.
Key failures
Experts reviewing the responses found that AI tools often provided overly general advice, without properly considering individual circumstances.
They also noted a tendency to reinforce harmful or incorrect beliefs rather than challenge them, which is a key part of effective therapy.
Another concern was what researchers described as “deceptive empathy,” where systems appeared understanding without genuinely grasping the situation.
Safety concerns
The most alarming issue highlighted was the lack of proper crisis management.
Researchers found that AI tools sometimes failed to direct users to appropriate support services, even in potentially dangerous situations.
Zainab Iftikhar, who led the study, noted that no comparable accountability exists for AI tools: “For human therapists, there are governing boards and mechanisms for providers to be held professionally liable for mistreatment and malpractice.”
Need for oversight
Experts say the findings underline the need for stronger oversight and testing before such tools are widely relied upon.
Ellie Pavlick, a Brown professor not involved in the research, said: “There is a real opportunity for AI to play a role in combating the mental health crisis… but it’s of the utmost importance that we take the time to really critique and evaluate our systems.”
Researchers stress that while AI may assist in mental health support, it should not replace trained professionals.
Sources: LADBIBLE, Brown University, RAND Health