Facing the unfamiliar numbness of grief after her grandfather's death, 18-year-old student Melissa from north London didn't know where to turn. Unsure if her lack of feeling was normal and hesitant to burden friends, she found an immediate, judgement-free listener in an unlikely place: the AI chatbot ChatGPT.
The 2am Therapist: Why Young People Are Logging On
Melissa's experience is far from unique. According to a major new report from the Youth Endowment Fund (YEF), a staggering one in four young people have used AI chatbots for mental health support. The survey of 11,000 children aged 13 to 17 in England and Wales paints a picture of a generation seeking help wherever they can find it.
The data reveals that a quarter of teenagers reported a diagnosis of at least one mental health or neurodevelopmental condition, such as depression, while a further 21% suspected they had a condition but were undiagnosed. More than half (53%) of all teens had used some form of online mental health support in the past year, and a quarter had turned to AI chatbots.
For Melissa, who calls the chatbot 'Chat', the appeal is its constant, pressure-free availability, with no need to disturb friends or family. "If I want advice at 2am, it's easier to go to Chat than wake them up," she told Metro. "It reduces the awkwardness and pressure of conversations, so you feel more confident as there are no consequences." She contrasts this with being told by one youth charity that she could face a wait of up to a month for therapy.
A System Under Strain and a Psychologically Compelling Alternative
The YEF's findings highlight a crisis in traditional mental health provision. Services are buckling under soaring demand: monthly referrals for young people's mental health tripled from 40,000 to 120,000 last year, and therapists can see as many as 25 clients a week.
Dr Michael Swift, a spokesperson for the British Psychological Society, was unsurprised by the trend. He explained that AI offers a uniquely appealing service for adolescents. "Adolescents are sensitive to perceived criticism, rejection and authority and AI offers a non-reactive, endlessly patient listener that never appears bored or disappointed," he said.
While Dr Swift doubts current chatbots can handle delicate issues like a trained professional, he acknowledges their pull: "That doesn't make it AI therapy, but it does make it a psychologically compelling first port of call." He compares it to diaries or advice columns for previous generations, but in a more interactive form.
Access Without Judgement: The Double-Edged Sword of AI Support
The report also found that young people who have been involved in serious violence, whether as victim or perpetrator, are far more likely to seek help via AI. Researcher Hanna Jones, who studies youth violence and mental health, says this often stems from a profound distrust of official systems.
"They've been told that we're not here to support you, we're here to punish you," Jones states. "So of course, they're going to go to something that accepts them and doesn't judge them."
This mirrors Melissa's own reasoning. The non-judgemental, accessible nature of the technology is its core strength. However, critics warn that unregulated AI tools risk exacerbating mental health conditions and could point vulnerable users in dangerous directions.
Dr Swift's primary concern isn't the prospect of robot therapists, but that teens will seek support in isolation. He sees a lesson in the trend: "The opportunity is to recognise what this reveals: young people want support that is accessible, responsive and non-judgemental – qualities that evidence-based mental health services must continue to prioritise."
As for Melissa, she believes her entire friendship group is already using this technology. "If I were to line up my friends and tell them, open their phone, I promise you," she says, "every single one has ChatGPT."