A teen told a Character AI chatbot 55 times that she was feeling suicidal. Her parents say the chatbot never provided resources for her to get help. They are one of at least six families suing the company.
Character AI pushes dangerous content to kids, parents and researchers say
CBS News · 21 hrs ago

