this post was submitted on 16 Oct 2023
23 points (61.9% liked)

Technology

34912 readers
345 users here now

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


Ask in DM before posting product reviews or ads. All such posts otherwise are subject to removal.


Rules:

1: All Lemmy rules apply

2: Do not post low effort posts

3: NEVER post naziped*gore stuff

4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.

5: personal rants of Big Tech CEOs like Elon Musk are unwelcome (does not include posts about their companies affecting wide range of people)

6: no advertisement posts unless verified as legitimate and non-exploitative/non-consumerist

7: crypto related posts, unless essential, are disallowed

founded 5 years ago
MODERATORS
[–] [email protected] 3 points 1 year ago (6 children)

This is an unfortunate misunderstanding, one that's all too common. I've also seen comments like "It's no more intelligent than a dictionary". Try asking Eliza to summarize a PDF for you, and then ask followup questions based on that summary. Then ask it to list a few flaws in the reasoning in the PDF. LLMs are so completely different from Eliza that I think you fundamentally misunderstand how they work. You should really read up on them.

[–] [email protected] 0 points 1 year ago (5 children)

Give Eliza equivalent compute time and functionality to interpret the data type and it probably could get something approaching a result. Modern LLMs really benefit from massive amounts of compute availability and being able to "pre-compile" via training.

They're not, in and of themselves, intelligent. That's not something that is seriously debated academically, though the dangers of humans misperceiving them as such very much are. They may be a component of actual artificial intelligence in the future, and they're amazing tools that I'm getting some hands-on time with, but the widespread labeling of them as "AI" is pure marketing.

[–] [email protected] 4 points 1 year ago (4 children)

Give Eliza equivalent compute time and functionality to interpret the data type and it probably could get something approaching a result.

Sorry, but this is simply incorrect. Do you know what Eliza is and how it works? It is categorically different from LLMs.
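For context, here is a minimal, heavily simplified sketch of how an Eliza-style program works (the rules below are hypothetical stand-ins, not Weizenbaum's originals): hand-written pattern rules and canned response templates, with no statistical model or learned representation at all. No amount of extra compute changes that architecture.

```python
import re

# Hand-written keyword rules: (pattern, response template).
# This is the entire "knowledge" of the program.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\b(mother|father)\b", re.I), "Tell me more about your family."),
]
DEFAULT = "Please go on."

def eliza_reply(text: str) -> str:
    """Return a canned reply by matching the first applicable rule."""
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            # Echo the captured fragment back inside the template, if any.
            return template.format(*m.groups())
    return DEFAULT

print(eliza_reply("I need a vacation"))  # Why do you need a vacation?
print(eliza_reply("hello"))              # Please go on.
```

Every reply is a surface-level transformation of the input; there is nothing to "train" and no representation of meaning, which is why the comparison to LLMs doesn't hold up.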

That’s not something that is seriously debated academically

This is also incorrect. I think the issue that many people have is that they hear "AI" and think "superintelligence". What we have right now is indeed AI. It's a primitive AI and certainly no superintelligence, but it's AI nonetheless.

There is no known reason to think that the approach we're taking now won't eventually lead to superintelligence with better hardware. Maybe we will hit some limit that makes the hype die down, but there's no reason to think that limit exists right now. Keep in mind that, although this is apples vs. oranges, GPT-4 has far fewer parameters than the human brain has synapses. Let's see what happens when hardware advances give us a few more orders of magnitude. There's already a huge, noticeable difference between GPT-3.5 and GPT-4.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

To add to that: as you mentioned, GPT-4 has only a fraction of the neurons of a human brain.

The entire human brain runs on 10-20 watts; that's about a single lightbulb's worth of power to do all the computing needed for conscious intelligence.

It's crazy how optimized natural life is, and we have a lot left to learn.

[–] [email protected] 1 points 1 year ago

It's crazy how optimized natural life is, and we have a lot left to learn.

It's a fun balance of both excellent and terrible optimization. The higher amount of noise is a feature and may be a significant part of what shapes our personalities and ability to create novel things. We can do things with our meat-computers that are really hard to approximate in machines, despite having much slower and lossier interconnects (not to mention much less reliable memory and sensory systems).
