this post was submitted on 04 Nov 2023
36 points (62.9% liked)
Privacy
you are viewing a single comment's thread
There is no such thing.
God dammit, the entire point of calling it CSAM is to distinguish photographic evidence of child rape from made-up images that make people feel icky.
If you want them treated the same, legally - go nuts. Have that argument. But stop treating the two as the same thing, and fucking up clear discussion of the worst thing on the internet.
You can't generate assault. It is impossible to abuse children who do not exist.
Did nobody in this comment section watch the video at all?
The only case mentioned by this video is a case where highschool students distributed (counterfeit) sexually explicit images of their classmates which had been generated by an AI model.
I don't know if it meets the definition of CSAM because the events depicted in the images are fictional, but the subjects are real.
These children do exist, some have doubtlessly been traumatized by this. This crime has victims.
I think a lot of people are arguing that the models which are used to generate these types of content are trained on literal CSAM. So it's like CSAM with extra steps.
Those people are morons.
In most (all?) countries no such distinction is made; the material is illegal all the same.