this post was submitted on 04 Nov 2023
36 points (62.9% liked)
Privacy
That's because it isn't happening. There's just no reason to do so.
What isn't happening? Them making fake CSAM? I haven't seen it because I don't want to see it, but I'm confident it's occurring. Some kid already got busted feeding images of girls in his class into an image generator and making nudes out of them.
So while it might not be widespread, it's 100 percent happening and will increase.
Honestly, releasing these generators to the general public was a mistake. They thought they could put up safety measures, but those are easily bypassed. I think they should have kept them locked up and only given access to people who are registered and trackable, with reviewers checking what they're generating.
All of these AI generators are getting abused left and right, and anyone who didn't think that would happen is an idiot.
No, I'm saying the models aren't being trained with actual CSAM. The comment I replied to was about training, not generation.
All I was saying is that you don't need to train a model on child abuse images to get it to output child abuse images.
Do you really think the people generating CSAM give a fuck about their training data? They make the content because they enjoy it - I'd guess they'd use all the training data available (of which, considering their interests, they likely have plenty).
The people generating it are rarely the ones who are training the models. They take pretrained models and prompt them for what they want.
Even if they were training a model for a specific subject, they could train it with any pictures of the subject and combine it with another model that can generate the kind of image they want.
There is absolutely no reason they would need abuse images for training. There are far better general NSFW models available right now than they could ever train themselves.