this post was submitted on 21 Sep 2023
214 points (92.5% liked)

Technology

[–] [email protected] 1 points 1 year ago (8 children)

Okay, well, if everyone had access to an AGI, anyone could design and distribute a pathogen that could wipe out a significant portion of the population. Then again, you'd have the collective force of everyone else's AI countering that plot.

I think that putting that kind of power into the hands of everyone shouldn't be done lightly.

[–] [email protected] 7 points 1 year ago

There are papers online on how to design viruses. Now you just need funding for a lab and staff, because this is nothing like Breaking Bad.

[–] [email protected] 6 points 1 year ago

You still can't manufacture it. Your comparison with nukes is actually a good example: the basic knowledge of how a nuke works is out there, yet most people would struggle to refine weapons-grade plutonium.

Knowledge is only one part of doing something.

[–] [email protected] 5 points 1 year ago (1 children)

Since when does AI translate to being able to create bacteria and stuff?

If having the information on how to do so was enough to create pathogens, we should already have been wiped out because of books and libraries.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (4 children)

You can't type "How do I make a pathogen to wipe out a city" into a book. A sufficiently advanced and aligned AI will, however, answer that question with a detailed list of production steps, resource requirements, and a timeline.

[–] [email protected] 5 points 1 year ago (1 children)

Oog, what if by making this fire, it burns down the forest?

[–] [email protected] 2 points 1 year ago (1 children)

Well, that did happen, to be fair.

[–] [email protected] 1 points 1 year ago
[–] [email protected] 4 points 1 year ago

Right. So, the actual danger here is... Search engines?

[–] [email protected] 0 points 1 year ago

This requires special materials like enzymes and such. It would be much easier to restrict access to those. Now, true, this godlike AI could go back and show you how to make all the base stuff, but you need equipment for this, like centrifuges, and you will need special media. It's like the AI telling you how to make a nuke, really. Yeah, it could start you off with Bronze Age metalsmithing and you could work your way up to the modern materials you would need, but realistically you won't be able to do it (assuming, again, you restrict certain materials).

[–] [email protected] 0 points 1 year ago

Have you heard about this thing called the internet?

[–] [email protected] 5 points 1 year ago (1 children)

I would say the risk of having AI be limited to the ruling elite is worse, though - because there wouldn't be everyone else's AI to counter them.

And if AI is limited to a few, those few WILL become the new ruling elite.

[–] [email protected] 6 points 1 year ago (1 children)

And people would be less likely to identify what AI can and can't do if we convince ourselves to limit our access to it.

[–] [email protected] 1 points 1 year ago

People are already incompetent enough at this when there's a disclaimer in front of their faces warning about GPT.

We're seeing responses even in this thread conflating AGI with LLMs. People at large are too fucking stupid to be trusted with this kind of thing.

[–] [email protected] 3 points 1 year ago

Are we back to freaking out about The Anarchist Cookbook?

[–] [email protected] 2 points 1 year ago (2 children)

You're just gonna print the pathogens with the pathogen printer? You understand that getting the information doesn't mean you're able to produce it.

[–] [email protected] 4 points 1 year ago

I need an article on how a 3D printer can be used to print an underground chemistry lab to produce these weapons-grade pathogens.

[–] [email protected] 1 points 1 year ago (1 children)

That's the thing, though: a sufficiently advanced intelligence will know how. You don't have to.

[–] [email protected] 2 points 1 year ago

I know how to build a barn. Doesn't mean I can do it by myself with no tools or materials.

Turns out that building and operating a lab that can churn out bespoke pathogens is actually even more difficult and expensive than that.

[–] [email protected] 1 points 1 year ago

Your brain is an (NA)GI

[–] [email protected] 0 points 1 year ago

Let's assume your hypothetical here isn't bonkers: How, exactly, do you propose limiting people's access to linear algebra?