Character.AI, a popular chatbot platform where users role-play with different personas, will no longer permit under-18 account holders to have open-ended conversations with chatbots, the company announced Wednesday. It will also begin relying on age assurance techniques to ensure that minors aren't able to open adult accounts.

The dramatic shift comes just six weeks after Character.AI was sued again in federal court by the Social Media Victims Law Center, which is representing multiple parents of teens who died by suicide or allegedly experienced severe harm, including sexual abuse. The parents claim their children's use of the platform was responsible for the harm. In October 2024, Megan Garcia filed a wrongful death suit seeking to hold the company responsible for the suicide of her 14-year-old son, Sewell Setzer III.