this post was submitted on 24 Jul 2024
249 points (97.0% liked)
Technology
you are viewing a single comment's thread
Eventually an AI will be developed that can learn from much less data. After all, we don't need to read the entire internet to get through our education. But that's not going to be an LLM. No matter how much you tweak an LLM, it won't get there. It's like tuning a coal-fired, steam-powered car until it can compete in a Formula 1 race.
Yeah, it's entirely plausible that LLMs end up being a small part of the answer, basically the language center of the brain. But the brain is a hell of a lot more complex than that. The language center isn't your whole brain, and it's only loosely connected to actual decision making. It confabulates a lot.
OpenAI stumbled on something that worked and ran with it, and people started proclaiming it to be the answer to everything. The same happened with Deep Learning and every AI invention so far. It's all just another stepping stone on the way.
It's already happening. A quote from Andrej Karpathy: