
‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[email protected] 9 points 11 months ago

> it’s an invasion of privacy to use someone’s likeness against their will

Is it? Usually photography in public places is legal.

[email protected] -3 points 11 months ago

Legal and moral are not the same thing.

[email protected] 6 points 11 months ago

Do you also think it's immoral to do street photography?

[email protected] -3 points 11 months ago (last edited 11 months ago)

I think it’s immoral to do street photography to sexualize the subjects of your photographs. I think it’s immoral to then turn that into pornography of them without their consent. I think it’s weird that you don’t. If you can’t tell the difference between street photography and manipulating photos of people (public or otherwise) into pornography, I can’t fucking help you.

If you go to a park, take photos of people, and then go home and masturbate to them, you need to seek professional help.

[email protected] 2 points 11 months ago (last edited 11 months ago)

What's so moronic about people like you is that you assume anyone trying to understand an issue beyond your own current thinking must be a monster harming people in the worst way you can conjure in your head. The original commenter saying it's weird that you're looking for trouble couldn't have been more dead on.

[email protected] -1 points 11 months ago

This is an app that creates nude deepfakes of anyone you want. It’s not comparable to street photography in any imaginable way. I don’t have to conjure any monsters, bro; I found one, and they’re indignant about being called out as a monster.

[email protected] 2 points 11 months ago

This has been done with Photoshop for decades, and with photo collage for a hundred years before that. Nobody is arguing that it's not creepy. It's just that nothing has changed.