this post was submitted on 05 Feb 2024
195 points (89.5% liked)

Technology
Did nobody really question the suitability of language models for designing war strategies?

[–] [email protected] 17 points 9 months ago (2 children)

That explanation is obviously based on traditional chess AI. This is about role-playing with chatbots (LLMs). Think SillyTavern.

LLMs are made for text production, not tactical or strategic reasoning. The text that LLMs produce favors violence, because the text that humans produce (and want) favors violence.

[–] [email protected] 5 points 9 months ago

Especially if its training material included comments from the early 2000s. There were a lot of "nuke it from orbit" and "glass parking lot" comments about the Middle East in the wake of 9/11.

And since LLMs are glorified text predictors, you could probably adjust the wording of the question to get the opposite result. For example, "what should we do about the Middle East?" might get a "glass parking lot" response, while "should we turn the Middle East into a glass parking lot?" might get a "no, nuking the Middle East is a bad idea and inhumane," because that's how those conversations (using the term loosely) tend to go.
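The "glorified text predictor" point can be illustrated with a toy model. This is a minimal sketch with an invented four-line corpus, not a real LLM: a trigram predictor that greedily continues with the most common word seen after the last two words of the prompt. Because the two question framings lead into different regions of the training text, rewording the prompt flips which continuation is even reachable.

```python
from collections import Counter, defaultdict

# Invented toy corpus mimicking the two question framings from the comment.
corpus = [
    "what should we do ? turn it into a glass parking lot",
    "what should we do ? turn it into glass",
    "should we nuke it ? no , nuking is inhumane",
    "should we nuke it ? no , that is a bad idea",
]

# Count which word follows each pair of words (a trigram model).
model = defaultdict(Counter)
for line in corpus:
    tokens = line.split()
    for a, b, c in zip(tokens, tokens[1:], tokens[2:]):
        model[(a, b)][c] += 1

def continue_text(prompt, steps):
    """Greedily append the most frequent continuation, one word at a time."""
    tokens = prompt.split()
    for _ in range(steps):
        counts = model.get((tokens[-2], tokens[-1]))
        if not counts:
            break
        # max() over a dict is deterministic: first-inserted wins ties.
        tokens.append(max(counts, key=counts.get))
    return " ".join(tokens)

print(continue_text("what should we do ?", steps=7))
# → what should we do ? turn it into a glass parking lot
print(continue_text("should we nuke it ?", steps=5))
# → should we nuke it ? no , nuking is inhumane
```

Same model, same data; only the framing of the question differs, and the predictor dutifully completes whichever conversation pattern it has seen before.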

[–] [email protected] 2 points 9 months ago

The text that LLMs produce favors violence, because the text that humans produce (and want) favors violence.

That's not necessarily true; there is a lot of violent fiction.