this post was submitted on 23 Nov 2023
20 points (60.4% liked)

Ask Lemmy


I know it's not even close there yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in, like, 7 years? Who would pay for a therapist or a psychologist when you can ask a floating head on your computer for help?

You might think this is a stupid and irrational question: "There is no way AI will ever do psychology well." But in today's day and age, I think it's pretty fair to ask when you're deciding about your future.

[–] [email protected] 12 points 11 months ago (1 children)

Care to elaborate?

It's an oversimplified statement from someone (sorry, I don't have the source), and I'm not exactly an AI expert, but my understanding is that current commercial AI products are nowhere near the "think and judge like a human" definition. They can scrape the internet for information, use it to react to prompts, and do a fantastic job of imitating humans, but the technology simply isn't there.

[–] [email protected] -1 points 11 months ago* (last edited 11 months ago)

The technology for human intelligence? Any technology will always be very different from human intelligence. What you're probably referring to is AGI, artificial general intelligence: an "intelligent" agent that doesn't excel at any one thing but can handle a huge variety of scenarios and tasks, as humans do.

LLMs are models specialized in generating fluent text, but they're very different from autocompletes, because they can work with concepts, semantics, and (pretty surprisingly) rather complex logic.

As an oversimplification, even humans are fancy autocomplete. They're just different, as LLMs are different.
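To make the "autocomplete" comparison concrete, here is a toy sketch (the function names and corpus are my own illustration, not from any real product) of what a plain statistical autocomplete does: it can only reproduce word sequences it has already seen, with no concepts or logic behind them — which is exactly what the comment argues LLMs go beyond.

```python
from collections import defaultdict

def train_bigram(text):
    """Build a bigram table: each word maps to the list of words seen after it."""
    words = text.split()
    table = defaultdict(list)
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def complete(table, start, length=5):
    """Greedy completion: repeatedly pick the most frequent next word."""
    out = [start]
    for _ in range(length):
        choices = table.get(out[-1])
        if not choices:
            break  # dead end: the word was never followed by anything
        out.append(max(set(choices), key=choices.count))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat sat on it"
model = train_bigram(corpus)
print(complete(model, "the", length=3))  # -> "the cat sat on"
```

A model like this has no idea what a "cat" is; it only counts adjacent words. The point of the comparison is that LLMs also predict the next token, but from learned representations rich enough to carry semantics, which is why calling them "just autocomplete" is an oversimplification in both directions.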