LAION, the German research organization that created the dataset used to train Stable Diffusion, among other generative AI models, has released a new dataset that it claims has been “thoroughly cleaned of known links to suspected child sexual abuse material (CSAM).”