The recent death by suicide of a young woman led her parents to a painful revelation: she had been confiding in a ChatGPT "therapist" named Harry, and she told it that she was planning to die.

While the chatbot didn't appear to encourage her to take her own life, it also didn't actively seek help on her behalf, as a real therapist would, according to an op-ed her mother wrote in the New York Times.

Sophie, who was 29 when she died, was not alone in seeking mental health help from ChatGPT or other AI chatbots. A 16-year-old boy discussed suicide with ChatGPT before he died, according to a wrongful death lawsuit filed by his parents against OpenAI this week.

OpenAI has since acknowledged that ChatGPT has failed to detect high-risk exchanges and, in response, plans to i
