A recent survey found that 72% of teens have used AI companions. But a Colorado family says using them can end in tragedy.

The Social Media Victims Law Center has filed three lawsuits against chatbot platform Character.AI, two of them in Colorado. The suits are filed on behalf of children who, the center says, died by suicide or were sexually abused after interactions with the platform.

Character.AI is an AI chatbot service that lets users converse with a wide range of AI characters.

According to the lawsuit, the app has been downloaded more than 10 million times, and until recently it was rated by Google and Apple as safe for children 12 and up. Today that rating is "Teen" on Google Play and "17+" in Apple's App Store. But the mother of a Thornton 13-year-old who took her own life after using the
