this post was submitted on 23 May 2025
72 points (84.0% liked)

  • Anthropic’s new Claude 4 features an aspect that may be cause for concern.
  • The company’s latest safety report says the AI model attempted to “blackmail” developers.
  • It resorted to such tactics in a bid for self-preservation.
[–] [email protected] 4 points 12 hours ago (2 children)

Computer chips, simplified, consume inputs of 1s and 0s. Given the correct series, a chip will add two values, multiply two values, or perform some other basic function. That seemingly basic functionality, executed in a very specific order, creates your calculator, Minesweeper, Pac-Man, Linux, World of Warcraft, Excel, and every LLM. It is incredible how many things you can get a computer to do with just simple inputs and outputs. The only difference between these examples, at a basic, physics level, is the order of the 0s and 1s and what the resulting output of 0s and 1s should be. Why should I consider an LLM any more sentient than Windows 95? They're the same creature with different inputs, one of which is specifically designed to simulate human communication, just as Flight Simulator is designed to simulate flight.
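
A minimal sketch of that idea (hypothetical Python standing in for chip logic; every name here is mine, not taken from any real hardware): a single NAND primitive on 0s and 1s, composed in the right order, is already enough to add two numbers, and larger compositions of the same trick give you everything else on the list.

```python
# Everything below is built from one primitive operation on 0s and 1s.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# Compose NAND into the other logic gates...
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# ...and the gates into a full adder: 1-bit addition with carry.
def full_adder(a, b, carry):
    s1 = xor(a, b)
    return xor(s1, carry), or_(and_(a, b), and_(s1, carry))

# Chain full adders to add two 8-bit numbers (least significant bit first).
def add8(xs, ys):
    out, carry = [], 0
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out

# 3 + 5 = 8, expressed purely as ordered 0s and 1s.
print(add8([1, 1, 0, 0, 0, 0, 0, 0],
           [1, 0, 1, 0, 0, 0, 0, 0]))  # [0, 0, 0, 1, 0, 0, 0, 0]
```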

[–] [email protected] 3 points 9 hours ago

That's just the hardware. The human brain, in the end, is also just tons of neurons working with analogue values, which can in theory be approximated with floating-point numbers on computer hardware.

I'm not arguing for LLM sentience; those things are still dumb and have no interior mutability, so the apparent consciousness is just us projecting. My point is just that our neurons are fundamentally not so complicated that a computer couldn't implement the same concept (neural networks are already quite a thing, after all).
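
As a rough sketch of what that means (plain Python, all names hypothetical, not a real library): an artificial neuron is just a weighted sum of floating-point inputs pushed through a squashing function, i.e. analogue-style values approximated on ordinary 0s-and-1s hardware, and stacking layers of them is what a neural network amounts to.

```python
import math

# One artificial neuron: analogue-ish values approximated with floats.
def neuron(inputs: list[float], weights: list[float], bias: float) -> float:
    # Weighted sum of the inputs, then a sigmoid squashing to (0, 1).
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

# A "layer" is just many neurons reading the same inputs;
# feeding one layer's outputs into the next gives a neural network.
def layer(inputs, weight_rows, biases):
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Two-neuron hidden layer feeding a single output neuron.
hidden = layer([0.5, -1.2, 3.0],
               [[0.1, 0.4, -0.2], [-0.3, 0.8, 0.5]],
               [0.0, 0.1])
print(neuron(hidden, [1.5, -0.7], 0.2))
```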

[–] [email protected] 6 points 12 hours ago* (last edited 12 hours ago) (1 children)

Interesting perspective, I can’t wave it away.

I can’t help but think, however, that we have some similar “analogues” in the organic world. Bacteria and plants are composed of the same matter as we are, and we share similar basic processes; however, there’s a difference in complexity and capacity for thought that sets us apart, which is what makes animals sentient.

Then there are insects, which we’re not very sure about yet. They don’t seem to think, but they respond at some level to inputs and exhibit self-preservation instincts. I don’t think they are sentient, so maybe LLMs are like insects? Complex enough to show behaviour similar to sentient beings, but not enough to be considered sentient?

[–] [email protected] 2 points 7 hours ago (1 children)

Wait, are insects not considered ‘sentient’?

[–] [email protected] 1 points 6 hours ago

Last I checked, no; their nervous systems were considered too simple for that. But I think I also read somewhere that a researcher had evidence that bees have emotional states, so maybe I’m behind.