this post was submitted on 03 Nov 2023
287 points (96.1% liked)

Technology

Teen boys use AI to make fake nudes of classmates, sparking police probe::Parents told the high school "believed" the deepfake nudes were deleted.

[–] [email protected] -4 points 1 year ago (6 children)

Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (5 children)

So do yearbooks, and any other kind of photos that depict children, for that matter.

You can’t keep moving the goalposts; by your logic, young people should never date or take photos together, because it could enable pedophiles somewhere, somehow.

These are children whose brains are still in development. They are discovering themselves, and you want to label them pedophiles forever because they didn’t make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah, that’s the real threat).

Instead of suggesting a way to help the victims, you are advocating for the creation of yet more victims.

What a pathetic, brain-dead stance you are defending.

[–] [email protected] -2 points 1 year ago (3 children)

A yearbook photo is not porn.

[–] [email protected] 1 points 1 year ago (1 children)

And an AI image with a face photoshopped onto it isn’t a photo of a child.

And a teen being sexually interested in other teens isn’t a pedophile.

[–] [email protected] 1 points 1 year ago (1 children)

It's still child porn, and someone getting off to child porn is a pedophile.

[–] [email protected] 0 points 1 year ago

So, to clarify.

You think two 15-year-olds having sex makes them both pedophiles?
