this post was submitted on 07 Sep 2023
167 points (96.1% liked)
Technology
you are viewing a single comment's thread
They're not pictures of real people; proceeding against it on that basis undermines the point and makes them look like idiots. It should be banned on principle, but ffs there's got to be a way to do it that doesn't look reactionary and foolish.
Except when they are pictures of real people doing a body swap
That isn't at all what an AI-generated image is. People have been doing that for more than 50 years.
🤦‍♀️ I obviously mean the replaced portions of the body are AI-generated, like Photoshop and various other tools have been doing.
That's been possible since before Photoshop, and it's certainly possible after.
Possible before, but it's as easy as one command now.
Shouldn't that already be covered under revenge porn laws? At least the distribution side of it.
But aren't these models built from source material? I imagine if you want CP AI, you need actual CP to train it, no? That definitely would be a problem.
No, you can use a genetic algorithm. You have your audience rate a legal, acceptable work. You present the same work to an AI and ask it to manipulate traits, and provide a panel of works to your audience. Any derivative that the audience rates better than the original is then given back to the AI for more mutations.
Feed all your mutation and rating data to an AI, and it can begin to learn what the audience wants to see.
Have a bunch of pedophiles doing the training, and you end up with "BeyondCP".
My question is where they got the training data for a sufficiently advanced CP image generator. Unless it's just AI porn with kids' faces? Which is still creepy, but I guess there are tons of pictures that people post of their own kids?
Manga, manhwa(?), CG sets, etc. of shota/loli. Sprinkle in some general child statistics for height, weight, etc. And I'm sure social media helped as well, with people making accounts for their babies, for God's sake.
Plenty of "data", I'm sure, to train up an AI.
Wouldn't put it past some sick fucks to feed undesirable content into an AI's training.
It’s unacceptable in any form.
It's obviously very distasteful, but those needs don't just go away. If people with that inclination can't satisfy their sexual urges at home just by looking at porn, it seems more likely they're going to go out into the world and try to find some other way to do it.
Also, controlling what people do at home when it isn't affecting anyone else, even in a case like this, isn't likely to target exactly those people, and it's very likely not to stop there either. I'd personally be very hesitant to ban or prosecute stuff like that unless there was actual evidence that it was harmful and that the cure wasn't going to be worse than the disease.
Humans have been raping kids since our inception. Childhood is a relatively modern concept that young adults are now a part of. It's an ugly and pervasive subject that needs further study to reduce child harm.
Nobody here said anything otherwise.