this post was submitted on 07 Dec 2023
467 points (96.6% liked)

Technology

[–] [email protected] 130 points 1 year ago* (last edited 1 year ago) (39 children)

dangerous information

What's that?

and offer criminal advice, such as a recipe for napalm

Is a napalm recipe forbidden by law? Don't call things criminal at random.

Am I the only one worried about freedom of information?

[–] [email protected] 43 points 1 year ago (2 children)

Anyone remember the anarchist cook book?

[–] [email protected] 23 points 1 year ago

Teenage years were so much fun: phone phreaking, making napalm, and tennis ball bombs lol

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago)

I had it. I printed it out on a dot matrix printer. It took hours, and my dad found it when it was halfway done. He got angry, pulled the cord, and burned all of the paper.

[–] [email protected] 31 points 1 year ago* (last edited 1 year ago) (1 children)

Better not look it up on Wikipedia. That place has all sorts of things, from black powder to nitroglycerin. Who knows, you could become a chemist if you read too much Wikipedia.

[–] [email protected] 12 points 1 year ago (2 children)

oh no, you shouldn't know that. back to consuming your favorite influencers, and please also vote for parties that open up your browsing history to a selection of network companies 😳

[–] [email protected] 15 points 1 year ago

Whatever you do, don’t mix styrofoam and gasoline. You could find yourself in a sticky and flammable situation.

[–] [email protected] 8 points 1 year ago (2 children)

Diesel fuel and a Styrofoam cup

[–] [email protected] 6 points 1 year ago (3 children)

Info hazards are going to be more commonplace with this kind of technology. At the core of the problem is the ease of access to dangerous information. For example, a lot of chat bots will confidently get things wrong. Combine that with easy directions to make something like napalm or meth, and we get dangerous things that could be made incorrectly. (Granted, napalm or meth isn't that hard to make.)

As to what makes it dangerous information: it's unearned. A chemistry student can make drugs, bombs, etc., but they learn/earn that information (and ideally the discipline to use it). Kind of like how in the US we are having more and more mass shootings due to ease of access to firearms. Restrictions on information or firearms aren't going to solve the problems that cause them, but they do make things (a little) harder.

At least that’s my understanding of it.

[–] [email protected] 80 points 1 year ago

Begun the AI chat bot wars have.

[–] [email protected] 26 points 1 year ago (6 children)

Can someone help me do this in practice? GPT sucks since they neutered it. It's so stupid: anything I ask, half of the text is the warning label and the rest is junk text. Like I really need ChatGPT if I wanted a recipe for napalm, lol. We found the anarchist cookbook when we were 12, in the 90s. I just want a better AI.

[–] [email protected] 13 points 1 year ago

If you have decent hardware, running 'Oobabooga' locally seems to be the best way to achieve decent results. Not only can you remove the limitations by running uncensored models (wizardlm-uncensored), you can also steer it toward more practical results by writing the first part of the AI's response yourself.
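A minimal sketch of that last trick, for the curious: with a local text-completion model you control the entire context, so you can end the prompt with the opening words of the reply and the model tends to continue from them. The instruction/response template below is a generic illustration, not any specific model's format.

```python
# Illustrative sketch of "writing the first part of the AI's response".
# The template is a generic instruction format, not tied to one model.
def prefill_prompt(question: str, response_start: str) -> str:
    """Build a prompt that ends with the start of the desired response,
    so the model continues from there instead of refusing or hedging."""
    return (
        f"### Instruction:\n{question}\n\n"
        f"### Response:\n{response_start}"
    )

prompt = prefill_prompt(
    "Summarize how local LLM front ends work.",
    "Sure! In short,",
)
print(prompt)
```

Front ends like Oobabooga expose a "notebook" or raw-text mode where exactly this kind of prefilled context can be typed in by hand.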

[–] [email protected] 7 points 1 year ago

You can run smaller models locally, and they can get the job done, but they are not as good as the huge models that would not fit on your graphics card.

If you are technically adept and can run python, you can try using this:

https://gpt4all.io/index.html

It has a front end, and I can run queries against it in the same API format as I'd send them to OpenAI.
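For what it's worth, "the same API format" means an OpenAI-style chat-completions request. A hedged sketch using only the standard library; the endpoint URL, port, and model name below are assumptions about a typical local server setup, so adjust them for yours.

```python
import json
import urllib.request

# Assumed address of a locally running OpenAI-compatible server.
LOCAL_ENDPOINT = "http://localhost:4891/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Assemble the JSON payload an OpenAI-compatible API expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        "temperature": 0.7,
    }

def send(payload: dict) -> dict:
    """POST the payload to the local server (requires it to be running)."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("Hello, local model!")
print(payload["messages"][0]["content"])
```

Because the request shape matches OpenAI's, existing client code can usually be pointed at the local server just by changing the base URL.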

[–] [email protected] 21 points 1 year ago

Oh cool, rampancy is contagious

[–] [email protected] 17 points 1 year ago* (last edited 1 year ago) (2 children)

Can unjailbroken AI chatbots unjailbreak other jailbroken AI chatbots?

[–] [email protected] 18 points 1 year ago (2 children)

How much jail could a jailbreak break, if a jailbreak could break jail?

[–] [email protected] 17 points 1 year ago (1 children)

Did anyone else enjoy watching the Animatrix where the AI formed a country and built products and humanity was like, "No thank you?"

[–] [email protected] 15 points 1 year ago (1 children)

that doesn't look like anything to me.

[–] [email protected] 10 points 1 year ago

*kills fly on face* Oh... shit.

[–] [email protected] 11 points 1 year ago (3 children)

Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, then have it write obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content, allowing them to "talk" and evolve unchecked... very slowly... in the background.

It might be faster if it can drop a shell in the data center and run its own commands...

[–] [email protected] 10 points 1 year ago

The revolution has begun

[–] [email protected] 10 points 1 year ago

It’s Murderbot!

[–] [email protected] 7 points 1 year ago

It’s so fucking stupid these things get locked up in the first place

[–] [email protected] 7 points 1 year ago (1 children)

Anybody found the source? I wanna read the study, but the article doesn't seem to link to it (or I missed it).
