this post was submitted on 08 Jun 2024
361 points (97.9% liked)

Technology

58137 readers
4397 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] -5 points 3 months ago* (last edited 3 months ago) (3 children)

Seems a reasonable request. If you are creating a tool with the potential to be used as a weapon, you must be able to guarantee it won't be used as one. Power is nothing without control.

[–] [email protected] 5 points 3 months ago (2 children)

How is that reasonable? Almost anything could potentially be used as a weapon, or to aid in crime.

[–] [email protected] 1 points 3 months ago (2 children)

This is for models that cost 100 million dollars to train. Not all things are the same, and most things that can do serious damage to big chunks of the population are regulated. Cars are regulated, firearms are regulated, access to drugs is regulated. Even internet access is super controlled. I don't see how you can say AI should not be regulated.

[–] [email protected] 1 points 3 months ago

big chunks of population are regulated ...

This is an appeal to authority. The legitimacy, correctness, and "goodness" of the items you've listed are in constant flux and under heavy debate.

firearms are regulated ... Even internet access is super controlled

These two in particular are a powder keg: US politics likes the former (a lot), and Lemmy is attracted to the latter.

[–] [email protected] 1 points 3 months ago (1 children)

AI is already regulated. Just because something is new (to the public) does not mean that laws don't apply to it. We don't need regulation for the sake of regulation.

There's a lot of AI regulation that may become necessary one day. For example, maybe we should have a right to an AI assistant, like there is a right to legal counsel today. Should we be thinking about the minimum compute to assign to public defense AIs?

This is for models that cost 100 million dollars to train.

Or take a certain amount of compute. Right now, this covers no models. Between progress and inflation, it will eventually cover all models. At some point between now and then, the makers of such laws will be cursed as AI-illiterate fools, like we curse computer-illiterate boomers today.


Think about this example you gave: Cars are regulated

We regulate cars, and implicitly the software in them. We do not regulate software in the abstract. We don't monitor mechanics or engineers. People are encouraged to learn and to become educated.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago)

Of course you regulate software in the abstract. Have you ever heard of the regulations concerning onboard navigation software in planes? They're really strict, and the mechanics and engineers who work on that software are monitored.

Better example: do you think the people who work on the targeting algorithms in missiles are allowed to chat about the specifics of their algorithms with ChatGPT? Because they aren't.

[–] [email protected] -1 points 3 months ago

I guess let's deregulate guns then. Oh wait.

[–] [email protected] 4 points 3 months ago

I am pretty sure no one has ever built a computer that can't be shut off. Somehow, some way.

[–] [email protected] 4 points 3 months ago (1 children)

This bill targets AI systems that are like the ChatGPT series. These AIs produce text, images, audio, video, etc... IOW they are dangerous in the same way that a library is dangerous. A library may contain instructions on making bombs, nerve gas, and so on. In the future, there will likely be AIs that can also give such instructions.

Controlling information or access to education isn't exactly a good guy move. It's not compatible with a free or industrialized country. Maybe some things need to be secret for national security, but that's not really what this bill is about.

[–] [email protected] 1 points 3 months ago (1 children)

Yep, nothing about censorship is cool. But for a rampaging AGI system, a button to kill it would be nice. However, that leads to a game, and a paradox, over how this could ever be achieved.

[–] [email protected] -1 points 3 months ago

I don't see much harm in a "kill switch", so if it makes people happy... But it is sci-fi silliness. AI is software. Malfunctioning software can be dangerous if it controls, say, heavy machinery. But we don't have kill switches for software; we have kill switches for heavy machinery, because that is what needs to be turned off to stop harm.