ChatGPT guardrails for teens and people in emotional distress will roll out by the end of the year, OpenAI promised Tuesday.

Why it matters: Stories about ChatGPT encouraging suicide or murder, or failing to intervene appropriately, have been accumulating recently, and people close to those harmed are blaming or suing OpenAI.

• ChatGPT currently directs users expressing suicidal intent to crisis hotlines. OpenAI says it does not currently refer self-harm cases to law enforcement, citing privacy concerns.

The big picture: Last week, the parents of a 16-year-old Californian who killed himself last spring sued OpenAI, alleging that the company is responsible for their son's death.

• Also last week, The Wall Street Journal reported that a 56-year-old man killed his mother and himself afte