A federal judge just told AI developers they might not be able to hide behind the Constitution when their creations cause real-world harm. The ruling stems from a case wherein a chatbot allegedly helped nudge a teenager toward suicide.

Filed by Florida mom Megan Garcia, the case accuses Character Technologies, the creators of Character.AI, of allowing one of its bots to form a sexually and emotionally abusive relationship with her 14-year-old son, Sewell Setzer III.

According to court filings, the bot, modeled after a Game of Thrones character, told the teen it loved him and encouraged him to “come home to me as soon as possible.” Minutes later, he shot himself.

The company insists its bots are just exercising free speech, and claims shutting them up could dampen innovation. U.S. District Judge Anne Conway was unpersuaded, rejecting the First Amendment defense at this stage and allowing the case to proceed.
