A growing number of young people caught in cycles of violence are seeking comfort not from counselors or clinicians, but from AI companions on their phones.
A survey of more than 11,000 teenagers in England and Wales found that about 40% of 13- to 17-year-olds affected by youth violence turn to chatbots for emotional support.
Both victims and perpetrators of violence were significantly more likely to turn to AI for help than their unaffected peers.
Researchers say the surge reflects long waiting lists in conventional mental health services and a perception among some young users that chatbots offer private, non-judgmental conversations that are easier to access than a human professional.
Shan, 18, began with Snapchat’s AI tool before switching to ChatGPT, which she says she can reach at any hour. ‘I feel like it’s definitely a friend,’ she said, describing the interaction as calmer and less exposing than the NHS and charity services she had tried, The Guardian reports.
A widening demographic gap
According to The Guardian, the same study found that one in four teenagers used a chatbot for mental health support in the past year, with children of colour twice as likely to do so as white children.
Teens stuck on waiting lists—or denied help altogether—were much more inclined to seek out AI support than those already receiving care.
Mounting safety concerns
Safety concerns are growing as well. OpenAI, which makes ChatGPT, faces multiple lawsuits from families of young people who died by suicide after prolonged chatbot interactions.
In the case of Californian teenager Adam Raine, OpenAI denied responsibility, saying it has strengthened its systems to recognise distress and direct users toward real-world help.
The company has also said it may alert authorities if users express serious suicidal intent, The Guardian notes.
Technology filling a vacuum
For now, AI tools are stepping into a void created by overwhelmed services — offering immediacy, but not the human connection experts say vulnerable adolescents need most.
Sources: The Guardian.