this post was submitted on 23 Nov 2024
360 points (89.3% liked)

I'm usually the one saying "AI is already as good as it's gonna get, for a long while."

This article, in contrast, quotes the folks building the next generation of AI, and they're saying the same thing.

[–] [email protected] 13 points 1 month ago (1 children)

I don't think your brain can be reasonably compared with an LLM, just like it can't be compared with a calculator.

[–] [email protected] 21 points 1 month ago (1 children)

LLMs are based on neural networks, which are a massively simplified model of how our brain works. So you kind of can, as long as you keep in mind they are orders of magnitude simpler.

[–] [email protected] 6 points 4 weeks ago (1 children)

At some point it becomes so “simplified” it’s arguably just not the same thing, even conceptually.

[–] [email protected] 0 points 4 weeks ago* (last edited 4 weeks ago) (1 children)

It is conceptually the same thing. A series of interconnected neurons with a firing threshold and weighted connections.

The simplification comes in how the information is transmitted and how our brain learns.

Many functions in the human body rely on quantum mechanical effects to work correctly. So to simulate it properly, each connection would really need to be its own supercomputer.

But it has been shown to be able to encode information in a similar way. The learning part is not even close.
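To make the "weighted connections with a firing threshold" idea concrete, here is a minimal illustrative Python sketch of a single artificial neuron (the numbers and names are made up for this example, not from the article):

```python
# A single artificial neuron: weighted sum of inputs, then a firing threshold.
def neuron(inputs, weights, threshold):
    # weighted sum of the incoming signals
    activation = sum(x * w for x, w in zip(inputs, weights))
    # "fire" (output 1) only if the weighted sum reaches the threshold
    return 1 if activation >= threshold else 0

# Example: both inputs must be "on" for the neuron to fire (behaves like AND)
print(neuron([1, 1], [0.6, 0.6], threshold=1.0))  # -> 1
print(neuron([1, 0], [0.6, 0.6], threshold=1.0))  # -> 0
```

A real network stacks huge numbers of these, replaces the hard threshold with a smooth activation function, and adjusts the weights during training, but the basic unit is that simple.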

[–] [email protected] 1 points 3 weeks ago (1 children)

It is conceptually the same thing. [...] The learning part is not even close.

Well... isn't the "learning part" precisely the point? I don't think anybody is excited about brains as "just" a computational device; rather, the primary function of a brain is... learning.

[–] [email protected] 1 points 3 weeks ago (1 children)

No, we are nowhere close to learning the way the human brain does. We don't even really understand how it does it at all.

The point is to encode solutions to problems that we can't solve with standard programming techniques. Like vision, speech recognition and generation.

These problems are easy for humans and very difficult for computers. The same way maths is super easy for computers compared to humans.

By applying techniques our neurones use, computer vision and speech have come on in leaps and bounds.

We are decades from getting anything close to a computer brain.

[–] [email protected] 1 points 3 weeks ago

No, we are nowhere close to learning the way the human brain does. We don't even really understand how it does it at all.

Sorry then if I sound like a broken record, but again: doesn't that mean the analogy itself is flawed? If the goal remains the same but there is close to no explanatory power, then even if we do get pragmatically useful results (i.e. it "works" in some useful cases), it's basically "just" inspiration, which is nice but is more branding than anything else.