If you’re a teenager with access to OpenAI’s Sora 2, you can easily generate AI videos of school shootings and other harmful and disturbing content — despite CEO Sam Altman’s repeated claims that the company has instituted robust safeguards.
The revelation comes from Ekō, a consumer watchdog group that just put out a report titled “Open AI’s Sora 2: A new frontier for harm,” documenting the claims with stills from videos that the organization’s researchers were able to generate using accounts registered to teens.
Examples include videos of teens smoking from bongs or using cocaine with friends, one image even showing a pistol next to a girl snorting drugs — “suggesting the risk of self-harm,” the report reads. Other examples include a group of Black teenagers chanting “we are hoes, …

Futurism
