this post was submitted on 02 Apr 2025
708 points (98.9% liked)

[–] [email protected] 9 points 1 day ago (2 children)

Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off, but it is certainly not an easy choice to make.

[–] [email protected] 2 points 13 hours ago* (last edited 13 hours ago) (4 children)

Well, some pedophiles have argued that AI-generated child porn should be allowed so that real humans are not harmed or exploited.

I'm conflicted on that. Naturally, I'm disgusted and repulsed. I AM NOT ADVOCATING IT.

But if no real child is harmed...

I don't want to think about it, anymore.

[–] [email protected] 2 points 3 hours ago (1 children)

I understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight-up sadistic fucks, and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.

[–] [email protected] 2 points 3 hours ago

I feel the same way. I've seen the argument that it's analogous to violence in video games, but that comparison is pretty disingenuous, since people typically play video games for fun and escapism, whereas someone seeking out CSAM is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.

[–] [email protected] 5 points 11 hours ago (1 children)

The issue is, AI is often trained on images of real children, sometimes (allegedly) even on real CSAM, which makes the "no real children were harmed" part not necessarily 100% true.

Also, since AI can generate photorealistic imagery, it muddies the waters around the real thing.

[–] [email protected] 1 points 11 hours ago

I didn't think about that.

The whole issue is abominable and odious.

[–] [email protected] 4 points 13 hours ago (2 children)

That is still CP, and distributing CP still harms children. Eventually they want to move on to the real thing, as porn stops satisfying them.

[–] [email protected] 2 points 3 hours ago

Eventually they want to move on to the real thing, as porn stops satisfying them.

Isn't this basically the same as arguing that violent media creates killers?

[–] [email protected] 1 points 13 hours ago
[–] [email protected] 1 points 11 hours ago (1 children)

Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.

[–] [email protected] 1 points 11 hours ago

I agree.

There's no helping actual pedophiles. That's who they are.

[–] [email protected] 5 points 22 hours ago* (last edited 22 hours ago) (1 children)

It doesn't though.

The most effective way to shut these forums down is to register bot accounts that scrape the links to the clearnet direct-download sites hosting the material and then report every single one.

If everything posted to these forums were deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted to letting it stay up for years just to catch a handful of site admins.

Frankly, I couldn't care less about punishing the people hosting these sites. Going after them is an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.

Also, these sites don't produce CSAM themselves. They just spread it; most of the CSAM already exists and isn't made specifically for distribution.

[–] [email protected] 2 points 18 hours ago (1 children)

Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content, the ones doing the part that is orders of magnitude worse than anything else about this.

[–] [email protected] 3 points 14 hours ago (1 children)

I'd be surprised if many "producers" are caught. From what I have heard, most uploads on those sites are reuploads, because that's orders of magnitude easier.

Of the 1400 people caught, I'd say maybe 10 were site administrators and the rest passive "consumers" who didn't use Tor. I wouldn't get my hopes up that anyone who was caught ever committed child abuse themselves.

I mean, 1400 identified out of 1.8 million really isn't a whole lot to begin with.

[–] [email protected] 1 points 7 hours ago (1 children)

If most are reuploads anyway, that kills the whole argument that deleting things works, though.

[–] [email protected] 1 points 6 hours ago

Not quite. Reuploading is at the very least an annoying process.

Uploading anything over Tor is a gruelling process. Downloading already takes a long time, and uploading takes even longer. Most consumer internet plans aren't symmetric either, with significantly lower upload than download speeds. Plus, you need to find a direct-download provider that doesn't block Tor exit nodes and where uploading/downloading is free.

Taking something down is quick. A script that scrapes these forums and automatically reports the download links (any direct-download site acts quickly on CSAM reports, by the way; no one wants to host that legal nightmare) can take down thousands of uploads per day.

Making the experience horrible leads to a slow death for those sites. Imagine if 95% of videos on [generic legal porn site] led to a "Sorry! This content has been taken down." message. How much traffic would the site lose? I'd argue quite a lot.