On Wednesday, Google announced a partnership with StopNCII.org to combat the spread of non-consensual intimate imagery (NCII). Over the next few months, Google will begin using StopNCII’s hashes to proactively identify nonconsensual images in search results and remove them. Hashes are algorithmically generated unique identifiers that let services identify and block imagery flagged as abuse without sharing or storing the source material itself. StopNCII says it uses PDQ for images and MD5 for videos.
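The matching idea can be sketched in a few lines. This is a minimal illustration, not Google's or StopNCII's actual pipeline: it uses Python's standard `hashlib` MD5 (the scheme StopNCII reports using for videos; PDQ, a perceptual image hash, requires a separate library), and the blocklist entry shown is a hypothetical placeholder.

```python
import hashlib

def md5_hash(path: str) -> str:
    """Compute the MD5 digest of a file, reading in chunks so large videos fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# A service can compare content against a set of flagged hashes
# without ever storing or transmitting the flagged media itself.
flagged_hashes = {"d41d8cd98f00b204e9800998ecf8427e"}  # hypothetical blocklist entry

def is_flagged(path: str) -> bool:
    return md5_hash(path) in flagged_hashes
```

Note that an exact hash like MD5 only matches byte-identical files; perceptual hashes such as PDQ are used for images precisely because they tolerate resizing and re-encoding.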

As Bloomberg points out, Google has been called out for being slower than others in the industry to take this approach and its blog post seemed to acknowledge that. “We have also heard from survivors and advocates that given the scale of the open web, there’s more to be
