Democracy Dies in Darkness

Exploitive, illegal photos of children found in the data that trains some AI

Stanford researchers found more than 1,000 child sexual abuse images in a prominent database used to train AI tools

December 20, 2023 at 7:00 a.m. EST
FILE - Students walk on the Stanford University campus on March 14, 2019, in Stanford, Calif. Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report from the Stanford Internet Observatory that urges technology companies to take action to address a harmful flaw in the technology they built. (AP Photo/Ben Margot, File)

More than 1,000 images of child sexual abuse have been found in a prominent database used to train artificial intelligence tools, Stanford researchers said Wednesday, highlighting the grim possibility that the material has helped teach AI image generators to create new and realistic fake images of child exploitation.

In a report released by Stanford University’s Internet Observatory, researchers said they found at least 1,008 images of child exploitation in a popular open source database of images, called LAION-5B, that AI image-generating models such as Stable Diffusion rely on to create hyper-realistic photos.