deathbird

joined 2 years ago
[–] [email protected] 13 points 3 days ago (1 children)

Public microblogging overall is a bane, so yes.

[–] [email protected] 8 points 1 week ago

The difference between clearly documenting features and hiding or removing them.

[–] [email protected] 4 points 1 week ago

The first time I saw a Zoomer do that, it hurt my soul.

[–] [email protected] 7 points 3 weeks ago

This is actually a good take. Kids aren't miniature adults, they're kids. They're not helpless or useless, but neither are they fully morally and emotionally developed. They need guidance. Plenty of adults can't responsibly handle internet access. I survived early online porn and gore and social media, but it's not like any of it benefited me in a meaningful way.

Some folks have an attitude that's like "I touched hot stoves and I learned better", but that's far from ideal.

[–] [email protected] 12 points 3 weeks ago

To be fair, at least as of this moment his prior post says Google is "manufacturing consent for", not "actively supporting". I believe the former can amount to the latter, but it doesn't have to.

[–] [email protected] 10 points 4 weeks ago

uBlock asks that you donate to the blocklist maintainers.

[–] [email protected] 3 points 1 month ago

I'm actually for the idea of emojis for protocols. Not Bitcoin specifically, because I don't think it has long-term potential as a deflationary virtual asset, but blockchain? Sure.

[–] [email protected] 32 points 3 months ago (6 children)

GOG is good too.

[–] [email protected] 7 points 3 months ago (2 children)

Right? Call it what you will, changing terms of sale or use after taking someone's money is wrong.

[–] [email protected] 1 points 3 months ago

There are probably safeguards in place to prevent the creation of CSAM, just like there are for other illegal and offensive things, but determined people work around them.

[–] [email protected] 15 points 4 months ago (2 children)

> the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well... You need to feed it CSAM.

First of all, not every image of a naked child is CSAM. This has actually been kind of a problem with automated CSAM detection systems triggering false positives on non-sexual images, and getting innocent people into trouble.

But also, AI systems can blend multiple elements together. They don't need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.

[–] [email protected] 33 points 4 months ago (8 children)

It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.

AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.
