- cross-posted to:
- technology@lemmit.online
AI-created child sexual abuse images ‘threaten to overwhelm internet’::Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law
Also, the models were trained on real images, so every image these tools create is directly related to the rape of thousands, or even tens of thousands, of children.
Real or not, these images came from real children who were raped in the worst ways imaginable.
I don’t think that’s the case
Why not?
You don’t need the exact content you want in order to train a model (e.g. a LoRA) for Stable Diffusion. If you train on naked adults and clothed kids, it can make some gross shit. And there are a lot more of those safe pictures out there to use for training. I’d bet my left leg that these models were trained that way.
Why? If these people have access to these images, why would you bet that they don’t use them?
There are dark web sites hosting huge sets of CSAM, so why wouldn’t these people use that? What are you betting on? Their morals?