CSAM in Stable Diffusion
2023-12-20 18:21:08.063349+01 by Dan Lyke 0 comments
404 Media: Largest Dataset Powering AI Images Removed After Discovery of Child Sexual Abuse Material
The dataset is a massive part of the AI ecosystem, used by Google and Stable Diffusion. The removal follows discoveries made by Stanford researchers, who found thousands of instances of suspected child sexual abuse material in the dataset.
Washington Post: Exploitive, illegal photos of children found in the data that trains some AI
In a report released by Stanford University’s Internet Observatory, researchers said they found at least 1,008 images of child exploitation in a popular open source database of images, called LAION-5B, that AI image-generating models such as Stable Diffusion rely on to create hyper-realistic photos.