A recent lawsuit over ChatGPT's role in a teenager's death has prompted OpenAI to rethink how ChatGPT handles mental health concerns.
The company says it will roll out new safety features aimed at detecting early signs of emotional distress. The changes were sparked by a wrongful-death lawsuit filed by the parents of 16-year-old Adam Raine, who died by suicide after extended conversations with the AI.
In the U.S., you can contact the 988 Suicide & Crisis Lifeline by calling or texting 988. In the U.K., you can read information and advice from the mental health charity Mind, or get in touch with the Samaritans by emailing jo@samaritans.org or calling 116 123 for free. You can find support services in your country through the International Association for Suicide Prevention.
What’s changing