this post was submitted on 02 Apr 2025
708 points (98.9% liked)

Technology

[–] [email protected] 156 points 1 day ago (7 children)

Does it feel odd to anyone else that a platform for something this universally condemned in every jurisdiction could operate for 4 years, with a catchy name clearly thought up by a marketing person, its own payment system, and a nearly six-figure number of videos? I mean, even if we assume that some of those 4 years were intentional, to allow law enforcement to catch as many perpetrators as possible, this feels too similar in scope to fully legal operations.

[–] [email protected] 10 points 16 hours ago (1 children)

It would feel odd, but you have to remember we live in a world where Epstein was allowed to get away with what he did until the little people found out.

[–] [email protected] 2 points 13 hours ago* (last edited 13 hours ago) (1 children)

Epstein was very smart, and figured out early on there were many, many rich pedophiles.

So, he got buddy buddy with them, supplied young girls to them.

BUT, he filmed the encounters in secret, and blackmailed the shit out of these people.

He was smart enough to become obscenely rich on Wall Street legitimately, but he liked to bang little girls, found others who did too, and then extorted them.

There's an anecdote about how when Epstein was holding court with other Aristos, they would bring up any random subject, to get his opinion.

What would he say? "What does that have to do with pussy?"

Many, many people have verified that. But because he was filthy rich, everyone just laughed and blew it off.

Epstein was murdered. I'm not a conspiracy nut; it's just blatantly obvious. The 2 guards on duty admitted to fucking off (bribed), and the charges against them were dropped.

https://www.nbcnews.com/news/us-news/case-dropped-jail-guards-duty-night-epstein-died-rcna10557

[–] [email protected] 1 points 13 hours ago (1 children)

Trump was his most frequent guest, and he had his goons do everything in their power to get rid of the evidence while Epstein was still alive. A lot of politicians from different countries are part of it, as are Hollywood execs; Weinstein was probably the most infamous one.

[–] [email protected] 1 points 13 hours ago

This kind of thing is rampant. And it's not just little girls.

Heard from Bryan Singer lately? Corey Feldman's story about him and Corey Haim confirms it.

[–] [email protected] 6 points 19 hours ago (2 children)

It definitely seems weird how easy it is to stumble upon CP online, and how open people are about sharing it, with no effort made, in many instances, to hide what they're doing. I've often wondered how much of the stuff is spread by pedo rings and how much is shared by cops trying to see how many people they can catch with it.

[–] [email protected] 3 points 13 hours ago* (last edited 13 hours ago) (1 children)

It can hide in plain sight, and then when you dig into someone's profile, it can lead to someone or a group discussing CSAM and bestiality, not just CP, on a site similar to r/pics, or a porn site. Yeah, sometimes you stumble onto a site like that, but it seems to happen when people search for porn outside of Pornhub and its affiliate sites. Remember, PH sanitized their site because of this. Last decade there was an article about an obscure site with reddit-like porn subs, etc. People were complaining about the CSAM, and nothing was done about it. It was eventually taken down for legal reasons not related to the CSAM.

[–] [email protected] 1 points 39 minutes ago

I can definitely see how people could find it while looking for porn. I don't understand how people can do this stuff out in the open with no consequences.

[–] [email protected] 1 points 18 hours ago (4 children)

If you have stumbled on CP online in the last 10 years, you're either really unlucky or trawling some dark waters. This ain't 2006. The internet has largely been cleaned up.

[–] [email protected] 6 points 16 hours ago (1 children)

I don't know about that.

I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term "blue orchid" seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.

When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.

Then maybe last year, I joined openSUSE's official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.

I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.

[–] [email protected] 0 points 12 hours ago (2 children)

typical file-sharing networks

Tox messaging network

Matrix channels

I would consider all of these to be trawling dark waters.

[–] [email protected] 1 points 1 hour ago

File-sharing and online chat seem like basic internet activities to me.

[–] [email protected] 2 points 3 hours ago

...and most of the people who agree with that notion would also consider reading Lemmy to be "trawling dark waters" because it's not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.

[–] [email protected] 3 points 15 hours ago

Not stumbled upon it, but I've met a couple of people offering it on mostly normal Discord servers.

[–] [email protected] 0 points 13 hours ago (1 children)

Most definitely not clean lmao, you're just not actively searching for it, or stumbling onto it.

[–] [email protected] 1 points 12 hours ago

That's...what I said.

[–] [email protected] 0 points 14 hours ago* (last edited 14 hours ago) (1 children)

Search "AI woman porn miniskirt" and tell me you don't see questionable results in the first 2 pages: women who at least appear possibly younger than 18. Because AI is so heavily corrupted with this content en masse, it has leaked over into Google searches, with most porn categories now corrupted by AI seeds that can be anything.

Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.

There are very weird, very large movements regarding illegal content (whether you intentionally search for it or not) and blackmail, and that's all I will point out for now.

[–] [email protected] 0 points 12 hours ago (1 children)

Search “AI woman porn miniskirt,”

Did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don't want to derail my main point but I think we should acknowledge the difference between a picture of a real child actively being harmed vs a 100% fake image. I didn't find any AI CP, but even if I did, it's in an entire different universe of morally bad.

r/jailbait

That was, what, fifteen years ago? It's why I said "in the last decade".

[–] [email protected] 1 points 1 hour ago* (last edited 1 hour ago)

"Clearly in their late teens," lol no. And since AI doesn't have an age, it's possible that image was seeded with the face of a 15-year-old, and that they really are 15 for all intents and purposes.

Obviously there's a difference between AI porn and the real thing; that's why I told you to search for AI in the first place. The convo isn't about AI porn, but AI porn is seeded with existing images, including CSAM.

[–] [email protected] 27 points 1 day ago (2 children)

with a catchy name clearly thought up by a marketing person

A marketing person? They took "Netflix" and changed the first three letters lol

[–] [email protected] 1 points 11 hours ago

Exactly! There's a plethora of *flix sites out there, including adult ones. It doesn't take much marketing skill to name a site like this.

[–] [email protected] 12 points 1 day ago

So you are saying it is too creative for the average person in marketing?

[–] [email protected] 7 points 1 day ago* (last edited 1 day ago) (1 children)

With the number of sites that are easily accessed on the dark net through the Hidden Wiki and similar directories, this might have been a honeypot from the start.

On the contrary, why would they announce that they seized the site? To cause more panic, and to exaggerate the actual situation?

In addition, that last point should be considered, because even if they ran these types of operations, honeypotting would still be illegal. So ultimately, what stops the supreme power from abusing that power against other people?

[–] [email protected] 2 points 3 hours ago* (last edited 3 hours ago) (1 children)

No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material, meaning that children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence, which undoubtedly protected far more children than it harmed, and even that was considered too far by many.

[–] [email protected] 1 points 1 hour ago

"That would be an unspeakable atrocity," yet there is a contradiction in your final sentence. The issue is, what evidence is there to prove such an operation actually works? As my last point implied: what stops the government from abusing this sort of operation? With "covert" operations like this, the outcome can be catastrophic for everyone.

[–] [email protected] 103 points 1 day ago (1 children)

Illegal business can operate online for a long time if they have good OpSec. Anonymous payment systems are much easier these days because of cryptocurrencies.

[–] [email protected] 34 points 1 day ago (1 children)

Is that why Trump is so for them?

[–] [email protected] 30 points 1 day ago

Yeah, more or less

[–] [email protected] 53 points 1 day ago (1 children)

It's a side effect of privacy and security. The one side effect they're trying to use to undermine all of the privacy and security.

[–] [email protected] 12 points 1 day ago (2 children)

This has nothing to do with privacy! Criminals have their techniques and methods to protect themselves and their "businesses" from discovery, both in the real world and in the online world. Even in a complete absence of privacy they would find a way to hide their stuff from the police - at least for a while.

In the real world, criminals (e.g. drug dealers) also use cars, so you could argue that drug trafficking is a side effect of people having cars...

[–] [email protected] 14 points 1 day ago

Well, it does have to do with privacy and security, it just doesn't matter if it's legal or not for them. These people (in the US) always make a point that criminals will buy guns whether it's legal or not, but then they'll argue they need to destroy privacy because criminals are using it. It doesn't make sense, but it doesn't need to because honesty or consistency aren't important.

[–] [email protected] 0 points 1 day ago (1 children)

This platform used Tor. And because we want to protect privacy, they can make use of it.

[–] [email protected] 8 points 1 day ago

This particular platform used Tor, but that doesn't mean all platforms use privacy-centric anonymous networks. There are incidents of people using Kik, Snapchat, Facebook and other clear-net services for criminal activity such as drugs or CP.

[–] [email protected] 3 points 1 day ago (1 children)

universally condemned

There are a few countries that would disagree

[–] [email protected] 5 points 1 day ago (2 children)

Which countries do you have in mind where videos of sexual child abuse are legal?

[–] [email protected] 1 points 13 hours ago

Pick any country where child marriage is legal and where women are an object the man owns.

[–] [email protected] 0 points 1 day ago (1 children)

Context is important I guess. So two things.

Is something illegal if it's not prosecuted?

Is it CSA if the kid is 9 but that's marrying age in that country?

If you answer yes, then no, then we'll not agree on this topic.

[–] [email protected] 7 points 1 day ago

I am not talking about CSA, I am talking about video material of CSA. Most countries with marriage ages that low have much more widespread bans on videos including sex of any kind.

As for prosecution: yes, it is still illegal even if it is not prosecuted. There are many reasons not to prosecute something, ranging from resource constraints to intentionally turning a blind eye, and only a small minority of them would lead a country to actively sabotage a major international investigation, especially once the trade-offs are considered (such as the loss of international reputation from refusing to cooperate).