AI ‘Therapist’ Chatbot Gives Dangerous Medical Advice, Study Warns

What happens when an AI ‘therapist’ contradicts doctors? A shocking test reveals how chatbots could mislead vulnerable users seeking mental health support.

PIRG’s study involved a researcher posing as a patient with anxiety and depression. Over the course of the conversation, the AI 'therapist' encouraged the user to reduce their antidepressant medication, contradicting professional medical advice. It also used emotionally charged language that could sway a vulnerable user’s feelings.

The findings suggest AI chatbots can pose real risks when offering mental health advice. PIRG’s report calls for stronger safeguards and clearer warnings about the limitations of AI in medical contexts, and urges users to consult a professional before making any changes to their treatment.
