Meta has removed a number of ads promoting "nudify" apps — AI tools used to create sexually explicit deepfakes using images of real people — after a CBS News investigation found hundreds of such advertisements on its platforms.

"We have strict rules against non-consensual intimate imagery; we removed these ads, deleted the Pages responsible for running them and permanently blocked the URLs associated with these apps," a Meta spokesperson told CBS News in an emailed statement.

CBS News uncovered dozens of those ads in the "Stories" feature of Meta's Instagram platform, promoting AI tools that, in many cases, advertised the ability to "upload a photo" and "see anyone naked." Other ads in Instagram's Stories promoted the ability to upload and manipulate videos of real people.