this post was submitted on 03 Nov 2023
123 points (86.8% liked)

Westfield is but one example of an issue all school districts are grappling with as the omnipresence of technology — including artificial intelligence — impacts students' lives, the district's superintendent Raymond González said in a statement.

all 42 comments

[–] [email protected] 64 points 1 year ago (4 children)

Regulation isn't going to stop this from happening, especially since there's already a company planning to build fleets of AI-processing barges that float in international waters specifically to bypass this kind of regulation.

[–] [email protected] 21 points 1 year ago

Not to mention it's already quite easy to run a local generative AI on a modest gaming PC, and it's only going to get easier.
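
For a sense of what "running it locally" actually looks like, here is a minimal sketch, assuming the Hugging Face diffusers and torch packages; the checkpoint id and prompt are illustrative placeholders, not a recommendation of any particular weights.

```python
import torch
from diffusers import StableDiffusionPipeline

model_id = "stabilityai/stable-diffusion-2-1"  # illustrative checkpoint id (assumption)
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32  # fp16 fits a mid-range gaming GPU

pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=dtype).to(device)
image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```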

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago) (2 children)

barges to float in international waters

International waters don't bring you money.

Regulation needs to hit them where the money is. If making these pictures is a crime and that company is aiding a crime, then an authority can seize all their money or even bar them from doing business entirely. Then they'll start to think.

But it needs legislation that really means it and leaves no loopholes.

[–] [email protected] 3 points 1 year ago

The loopholes are other countries

[–] [email protected] 3 points 1 year ago

You seem to forget that VPNs and crypto exist, so no country can stop it from happening.

[–] [email protected] 3 points 1 year ago

Is there really such a company, or is that just an idea? If so, that sounds terrifying.

[–] [email protected] -1 points 1 year ago

And then that fleet will sell access to children? Isn’t that a bit disconnected from the specifics of the topic?

[–] [email protected] 55 points 1 year ago (2 children)

I feel like they said the same thing about Photoshop.

[–] [email protected] 42 points 1 year ago* (last edited 1 year ago) (2 children)

Photoshop always required skill, a computer to run it, and a copy of a paid program.

This stuff doesn't have most of those hurdles. It's all about ease, and about how it runs on the pocket computer that you and all your classmates carry around all the time.

Your thought is still a fair one to have. But there are big differences between what came before and this new stuff. In the past you would have needed a ton more skill, and a bunch of things to line up, to casually generate fake nudes like the ones covered by this article.

[–] [email protected] 30 points 1 year ago (2 children)

and a copy of a paid program.

I pirated Photoshop when I was 13

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago)

Sweet. Me too. In the 90s. This is partially where I draw my understanding of the situation from.

Specifically:

  • The idea that piracy of professional software isn’t as casual as phone apps or web apps.
  • The fact that it’s paid software that is professional software with a learning curve.

[–] [email protected] 8 points 1 year ago

What is warez?

[–] [email protected] 13 points 1 year ago (2 children)

You need a hell of a lot more computing power to run an LLM than you do Photoshop.

[–] [email protected] 17 points 1 year ago* (last edited 1 year ago) (1 children)

While true, you can generate AI images with a potato; it just takes longer. For my setup, Stable Diffusion on my RTX 3060 generates a basic image in around 10 seconds, while running it CPU-only takes around five minutes, but the result is exactly the same.
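
A rough sketch of that GPU-versus-CPU comparison, again assuming diffusers/torch and an illustrative checkpoint; actual timings depend entirely on the hardware.

```python
import time
import torch
from diffusers import StableDiffusionPipeline

model_id = "stabilityai/stable-diffusion-2-1"  # illustrative checkpoint id (assumption)
prompt = "a watercolor painting of a lighthouse at dusk"

devices = (["cuda"] if torch.cuda.is_available() else []) + ["cpu"]
for device in devices:
    dtype = torch.float16 if device == "cuda" else torch.float32
    pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=dtype).to(device)
    start = time.perf_counter()
    image = pipe(prompt, num_inference_steps=30).images[0]
    print(f"{device}: {time.perf_counter() - start:.1f} s")  # seconds on a decent GPU, minutes on CPU
    image.save(f"lighthouse_{device}.png")
```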

[–] [email protected] 2 points 1 year ago (1 children)

Do the files match exactly, hash for hash? I wonder if there's a fundamental difference introduced by using different hardware.

[–] [email protected] 6 points 1 year ago (1 children)

AI always generates different outputs for the same input (it appears to be non-deterministic), so it would be impossible to confirm that exactly.

But I suppose what they mean is that the outputs appear to be of the same quality. Taking longer does not appear to decrease the quality of the output.

I suppose you could give an AI the same input, resetting it after each run, and then use statistical models to identify common traits. Then do the same thing on different hardware, run the same statistical analysis, and see if there is a difference between group A and group B, but as far as I'm aware no one has done this.

In theory hardware shouldn't matter; it's all mathematics, basically, and one plus one always equals two, so there shouldn't be any fluctuations.

[–] [email protected] 1 points 1 year ago (1 children)

Yes, I suppose that given equal input (model, keyword, seed, etc.) two Stable Diffusion installs should output the same images; what I am curious about is whether the hardware configuration (e.g. GPU manufacturer) could result in traceable variations. As abuse of this tech gains prominence, tracing the producer of a piece of synthetic media back to a specific hardware combination could become a thing.
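
A sketch of that reproducibility check, assuming diffusers/torch and an illustrative checkpoint: with a fixed seed and identical settings, repeat runs on the same machine should normally produce matching files, while different GPUs, drivers, or library versions can introduce small floating-point differences, so byte-identical hashes across hardware are not guaranteed.

```python
import hashlib
import io

import torch
from diffusers import StableDiffusionPipeline

model_id = "stabilityai/stable-diffusion-2-1"  # illustrative checkpoint id (assumption)
device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = StableDiffusionPipeline.from_pretrained(model_id).to(device)

def generate_and_hash(seed: int) -> str:
    # Fixing the seed pins down the initial noise, the main source of run-to-run variation.
    generator = torch.Generator(device=device).manual_seed(seed)
    image = pipe("a watercolor lighthouse", num_inference_steps=30, generator=generator).images[0]
    buf = io.BytesIO()
    image.save(buf, format="PNG")
    return hashlib.sha256(buf.getvalue()).hexdigest()

print(generate_and_hash(1234))
print(generate_and_hash(1234))  # expected to match on this machine; may differ on other hardware
```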

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

While it could work like bullet forensics, where given access to the gun you can fire it and compare the result to the original bullet, there is no way to look at a generated image and figure out exactly what made it; there are simply way too many variables and random influences. Well, unless the creator is dumb enough to leave the metadata enabled: by default the AUTOMATIC1111 Stable Diffusion web UI embeds all of it in the file itself as a PNG comment thingy.
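
A quick sketch of checking for that embedded metadata with Pillow; the "parameters" key is an assumption based on how the AUTOMATIC1111 web UI is commonly reported to store its generation settings in a PNG text chunk.

```python
from PIL import Image

img = Image.open("suspect_image.png")  # hypothetical file name
params = img.info.get("parameters")    # assumed metadata key; absent if it was stripped
if params:
    print("Embedded generation settings found:")
    print(params)
else:
    print("No generation metadata in this file.")
```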

[–] [email protected] 5 points 1 year ago (1 children)

I assume these models are being run on servers.

[–] [email protected] 7 points 1 year ago

Which server-run model allows pornography of this type?

[–] [email protected] 4 points 1 year ago

Or unregulated canvas sales and (gasp) artists who paint for money.

[–] [email protected] 41 points 1 year ago* (last edited 1 year ago) (2 children)

"knife stabbings done with [new alloy] knives shows that [new alloy] is a major threat to society, spokesman says, and remind people to pretend that knife stabbings is a completely new thing that didn't exist before"

[–] [email protected] 12 points 1 year ago

Deepfakes are nothing new, but the ease of access to said tools these days and their popularity are definitely something to be concerned about. Don't downplay it.

[–] [email protected] 33 points 1 year ago (1 children)

Harassment is wrong whether you're doing it with Stable Diffusion or passing notes with rude cartoons.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (1 children)

The question that this probably comes down to is: "is this a harassment problem or an AI problem?"

[–] [email protected] 12 points 1 year ago

Imagine if word processors or email clients refused to let you write malicious or hostile messages. I don't think that would be an improvement.

[–] [email protected] 29 points 1 year ago (2 children)

Privacy is dead. Society just hasn't realized it yet.

[–] [email protected] 3 points 1 year ago (1 children)

[–] [email protected] 21 points 1 year ago (1 children)

Cameras everywhere, everyone carrying around GPS trackers all the time, AI creating realistic duplicates of people's appearance and voice. Tech will increasingly make keeping secrets harder.

[–] [email protected] 0 points 1 year ago

Just saying: GPS on phones is receive-only, so by itself it's actually very private.

[–] [email protected] -1 points 1 year ago

Information wants to be free

[–] [email protected] 6 points 1 year ago (1 children)

Look, that's nice and all, but it's not going away and it's only going to get worse. The age of fake AI porn is only beginning. Full-on porn videos, where you can take a couple of photos of someone and the AI will build a model of the person and insert them into the video, are coming. Whether it's done for laughs, for embarrassment, or because it's sexy doesn't matter.

This genie is not going back in the bottle. This is only the tip of the iceberg. We are moving towards an age where you will be able to have virtual sex with anyone you want, as long as you have a picture or video of them. VR sex and porn games that can map someone onto a character aren't that far away. It really doesn't harm anyone if we quit being such prudes.

[–] [email protected] 6 points 1 year ago (1 children)

It'll end up just being an extension of the "don't randomly tell someone that you have fantasies about them" rule. You keep your AI masturbation habits between you, the AI, and the tech companies and government agencies spying on your masturbation habits.

[–] [email protected] 2 points 1 year ago

I completely agree.