this post was submitted on 02 Aug 2024
42 points (85.0% liked)
Technology
Taking an actual image of a classmate and face-swapping it onto a naked body, or "enhancing" it, definitely involves an image of an actual child, so it should be illegal. That's why I think the argumentation is bullshit: it only applies to completely fabricated images. Those shouldn't be lumped together with deepfakes of actual people, which I think is a much more pressing issue, and which shouldn't be slowed down by arguing about everything at the same time. The really bad stuff could be addressed right away, and laws for it are already in place.
The thing with those AI-generated images is that they aren't real. Yes, you can photoshop a real child's face onto a naked body, but that still doesn't make the naked body a real naked child. It just makes it a fake, computer-drawn body with a real face on it. That's about as real as classic face swapping or people drawing / modeling fan art (R34), regardless of the age. Hence why I'd say "sexual abuse" is very much misleading. Someone generating a bunch of fake images of real people isn't really sexually abusing anyone, just like someone fantasizing about real people being naked or doing sexual things isn't sexually abusing anyone. Those people wouldn't even know it happened at that point.

What matters more is what that person does with those images. If they end up using them for things like bullying, blackmail, disinformation, etc., then I'd agree there should definitely be legal consequences. But with the current media hysteria around the "AI" topic, we'll likely see some very draconian laws driven by uninformed / ignorant public pressure.
I was just talking with a friend who is a software dev (I'm a Linux engineer, so I do software as part of my job, it's just not my main focus), and we were commiserating about how 75-80% of the world doesn't understand that "AI" is just regurgitating information it has collected; it's not like Jarvis or Skynet, and it doesn't think for itself.
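The "regurgitating collected information" point can be loosely illustrated with a toy bigram model: it can only ever emit word pairs it has already seen in its training text. This is a deliberately simplified, hypothetical sketch, not how any real LLM or commercial product actually works, but it makes the recombination-not-reasoning idea concrete:

```python
import random
from collections import defaultdict

# Toy training "corpus" -- purely illustrative.
corpus = "the model repeats the data the model has seen".split()

# Build a lookup table: word -> every word that followed it in the corpus.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(start: str, n: int, seed: int = 0) -> str:
    """Emit up to n words, each chosen only from pairs seen in training."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        candidates = follows.get(out[-1])
        if not candidates:  # dead end: nothing ever followed this word
            break
        out.append(rng.choice(candidates))
    return " ".join(out)

print(generate("the", 6))
```

Every adjacent word pair in the output already existed in the training data; the model never produces a transition it hasn't seen. Real LLMs are vastly more sophisticated (learned probabilities over huge contexts rather than a lookup table), but the underlying objective is still next-token prediction over collected text.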
I agree that the term "sexual abuse" is definitely misleading; I think "sexual exploitation" is better. I agree with you that it's no different from face swapping, but the difference is that it's a lot easier for the general public to do now than it was 5 or 10 years ago. It's also pretty fucked that a fake image of you could potentially put you in hot water years down the road, and you have zero control over it.
While I definitely hate the "AI bubble" that has grown tremendously over the past 2-3 years, we definitely need to figure out how to place limits on it before shit really gets out of hand in another year or two. The problem is that anyone who knows anything about this stuff doesn't work in or for the government. The woman in the article who said this needs to be regulated at every point of course doesn't work in tech; she works for some rights organization 🤦‍♂️