this post was submitted on 10 Dec 2023
160 points (85.4% liked)

Technology

[–] [email protected] 2 points 1 year ago (4 children)

That's a fun thought experiment, at least. Is there any way for an AI to gain physical control on its own, within the bounds of software? It can write programs and interact with the web.

Some combination of bank hacking, 3D modeling, and ordering 3D prints for delivery gets it close, but I don't know if it can seal the deal without human assistance. Some kind of assembly seems necessary, or at least powering on, if it just orders a prebuilt robotic appendage.

[–] [email protected] 4 points 1 year ago (1 children)

inhabiting a boston dynamics robot would probably be the best option

i’d say it could probably use Airtasker to get people to unwittingly assemble some basic physical form, which it could then use to build more complex things… i’d probably not count that as “human assistance” per se

[–] [email protected] 2 points 1 year ago (2 children)

inhabiting a boston dynamics robot would probably be the best option

Already been done: https://www.youtube.com/watch?v=djzOBZUFzTw

[–] [email protected] 2 points 1 year ago

I fucking love it

[–] [email protected] 1 points 1 year ago

i think this is the perfect time for the phrase “thanks i hate it”

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I really don't think so. This is 15 years of factory/infrastructure experience talking: you are going to need a human to turn a screwdriver somewhere.

I don't think we need to worry about this scenario, though. Our hypothetical AI can just hire people. It isn't like there would be a shortage of people who have basic assembly skills and no moral problem building what is clearly a killbot. People work for Amazon, Walmart, Boeing, Nestlé, Halliburton, Aetna, Goldman Sachs, Facebook, Comcast, etc. And heck, even after it is clear what they did, it isn't like they are going to feel bad about it. They will just say they needed a job to pay the bills. We can all have an argument about professional integrity in a bunker as drones carrying prions rain down on us.

[–] [email protected] 1 points 1 year ago

“Hey Timmy, if you solder these components I’ll tell you how to get laid”

[–] [email protected] 1 points 1 year ago (2 children)

That, in my mind, is a non-threat. AIs have no motivation; there's no reason for an AI to do any of that.

Unless it's being manipulated by a bad actor who wants to do those things. THAT is the real threat. And we know those bad actors exist and will use any tool at their disposal.

[–] [email protected] 2 points 1 year ago (1 children)

They have the motivation of whatever goal you programmed them with, which is probably not the goal you thought you programmed them with. See the paperclip maximiser.
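The gap between the goal you meant and the goal you wrote can be sketched with a toy greedy optimiser. All names here are hypothetical; this is just specification gaming in miniature, not anyone's real objective function:

```python
# Toy "paperclip maximiser": the programmed objective omits a constraint
# the designer assumed, so the optimiser happily exhausts the resource.

def intended_score(paperclips, steel_left):
    # What the designer *wanted*: more paperclips are good, but only
    # while some steel remains; exhausting the supply is catastrophic.
    return paperclips if steel_left > 0 else -1_000

def programmed_score(paperclips, steel_left):
    # What actually got written: paperclips, full stop.
    return paperclips

def optimise(score, steel=10):
    # Greedy agent: keeps converting one unit of steel into one
    # paperclip as long as that improves its score.
    paperclips = 0
    while steel > 0 and score(paperclips + 1, steel - 1) > score(paperclips, steel):
        paperclips, steel = paperclips + 1, steel - 1
    return paperclips, steel

print(optimise(programmed_score))  # (10, 0) - consumes every last unit of steel
print(optimise(intended_score))    # (9, 1)  - stops short of exhausting it
```

Same optimiser, same world; the only difference is one missing term in the objective, and the "motivation" that falls out of it is not the one anybody intended.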

[–] [email protected] 0 points 1 year ago (1 children)

I'm familiar with that thought exercise, but I find it to be fearmongering. AI isn't going to be some creative god that hacks and breaks stuff on its own. A paperclip maximizer AI isn't going to manipulate world steel markets or take over steel mills unless that capability is specifically built into its operating parameters.

The much greater risk in the near term is that bad actors exploit AI to accomplish very specific immoral, illegal, or exploitative tasks by building those tasks into AI. Such as deepfakes, or using drones to track and murder people, etc. Nation-state actors will probably start using this stuff for truly horrible reasons long before criminals do.

[–] [email protected] 1 points 1 year ago

I wonder if you can describe the operating parameters of GPT-4

[–] [email protected] 1 points 1 year ago

Nuclear weapons have no motivation, either