Teens Turn to AI Chatbots for Mental Health—But at What Cost?

AI chatbots offer 24/7 comfort, yet studies show they can worsen teen struggles. Are we trading quick fixes for real help? The debate grows.

More young people in Switzerland and beyond are turning to AI chatbots like ChatGPT for mental health support. In 2024, the country’s youth helpline recorded 47,000 consultations, a 13% rise, as demand for psychological help grows. But experts warn that while these tools offer quick responses, they cannot replace professional care or handle emergencies.

AI chatbots are becoming a popular choice for teens struggling with mental health. About a quarter of 13- to 17-year-olds in England and Wales now use them. Their 24/7 availability, instant replies, and seemingly empathetic tone make them attractive to young users.

However, these tools have serious limitations. Unlike a human therapist, they cannot detect acute suicidality or provide emergency assistance. A study by TU Dresden found cases in which adolescents’ conditions worsened after prolonged interactions with ChatGPT, and researchers there are now calling for stricter regulations to prevent harm.

The rise in AI use comes as child and adolescent psychiatry faces systemic shortages. With fewer professionals available, teens often seek alternatives. Yet experts stress that AI should only complement, not replace, human support. They also warn that over-reliance on chatbots could lead to emotional dependency and withdrawal from real-world connections.

To reduce these risks, AI systems should direct users in crisis toward professional help and stop responding if misuse is detected. The exact scale of AI chatbot use in Switzerland remains unclear, but the chatbots’ growing presence reflects the broader strain on mental health services.

AI chatbots can provide immediate support for mild anxieties, but they cannot handle serious psychological emergencies. As their use increases, clear guidelines and safeguards are needed to ensure they serve as a bridge to professional care rather than a substitute for it. Without proper oversight, vulnerable users could face greater risks instead of receiving the help they need.
