Go tell a Kalahari bushman to click a button, or log into your Amazon account, or send an email, or do literally anything you don't place in front of him as an option.
Is your whole point just that it would be AGI if it weren't for those darned shackles, but it's not AGI because we give it restrictions on sending POST requests?
Besides the detail that even Kalahari Bushmen have mobile phones now, primitive humans (or rather, our ancestors) weren't stupid. You could take a human from 1000 years ago, and once they stopped flipping out about computers and modern technology, you could teach them to click a button in seconds to minutes (depending on how complex you make the task).
General AI can take actions on its own (unprompted) and it can learn, essentially modifying its own code. If anyone ever comes up with a real AGI, we'd head towards the Singularity in no time, as the only limit would be processing power, and the AI could invest its time into improving the hardware it runs on.
There are no "shackles" on ChatGPT; it's literally an input-output machine. A really damn good one, but nothing more than that. It can't even send a POST request. Sure, you could sit a programmer down to parse the output and fire off a request with a payload whenever ChatGPT mentions certain keywords. Of course that works, but then what? You have a dumb chatbot firing random requests, and if you try to feed the result of those requests back in, it's going to get jumbled up with the text input you made beforehand. Every single action you want an LLM to take, you'd have to program manually, something like the sketch below.
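To make that concrete, here's roughly what that glue code looks like (a hypothetical Python sketch; the SEND_POST keyword and the JSON shape after it are things I just made up for illustration, not any real ChatGPT feature):

```python
import json
import requests  # assumes the common third-party 'requests' HTTP library

def maybe_fire_request(model_output: str) -> str | None:
    """Scan the model's text for our made-up keyword and fire a POST."""
    if "SEND_POST" not in model_output:
        return None  # ordinary text answer, nothing to do
    try:
        # Assume we prompted the model to emit JSON after the keyword,
        # e.g.: SEND_POST {"url": "https://example.com/api", "body": {...}}
        _, payload = model_output.split("SEND_POST", 1)
        spec = json.loads(payload.strip())
        # The model never touches the network; this wrapper does all of it.
        resp = requests.post(spec["url"], json=spec.get("body", {}), timeout=10)
        # Now you're stuck cramming resp.text back into the next prompt,
        # where it gets jumbled together with everything typed before.
        return resp.text
    except (ValueError, KeyError, requests.RequestException):
        return None  # the model emitted garbage; that happens constantly
```

And every new kind of action needs another hand-written branch like this one.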
Oh you bastard. You actually tried to reframe my words into exactly the opposite of what I was saying.
I did not use a Kalahari Bushman as an example of a stupid person. I used a Kalahari Bushman as an example of a general intelligence as smart as you or I, who can't press buttons or buy things on Amazon for reasons of access, not capability.
I need to cool down before I read the rest of your comment. Not cool, dude, trying to twist what I said into some kind of racist thing. Not cool.
That wasn't my intention at all; we are talking about capabilities here, not access.
You could give ChatGPT every resource in the world: all the processing power, every account credential (usernames, passwords), an unlimited 100 Gbit fiber connection, and zero restrictions on the language model.
It doesn't matter; it's straight up not built to take actions, or to be an AI at all. It's an input-output machine: text in, text out, that's it.
It's just so damn complex at this point that the text output is really good, but there isn't more to it. Even the capability to "remember" your previous input isn't actually remembering; your next input just goes down a different pathway in the model (which has billions of parameters) to get to your new text output.
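If it helps, the entire trick behind chat "memory" fits in a few lines of Python (generate() here is a hypothetical stand-in for whatever model you're calling, not a real API):

```python
def generate(prompt: str) -> str:
    # Stand-in for one forward pass: a pure function, text in, text out.
    # No state survives between calls.
    return "(model output for: " + prompt[-40:] + ")"

history: list[str] = []

def chat(user_message: str) -> str:
    history.append(f"User: {user_message}")
    # The whole transcript is re-fed as one big input on every turn.
    # The model doesn't "remember" turn one; turn one is literally
    # pasted into the prompt again, steering the output differently.
    reply = generate("\n".join(history) + "\nAssistant:")
    history.append(f"Assistant: {reply}")
    return reply
```

Drop the history list and the "memory" vanishes instantly, because it was never in the model to begin with.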