By using a completely different tool. LLMs are fine for sentence structure, but they aren't intelligent. There is no capacity to distinguish fact from fiction, or to form an underlying model of reality to draw answers from.
Excellent point. Everyone say it with me:
LLMs are not AI.
I'm curious: what makes it not AI? It definitely seems like AI; it can learn and create new sentences.
It is absolutely AI. The experience of talking with ChatGPT is so human-like that it just blows my mind. What I've learned so far is that human brains aren't nearly as magical as they seem.
I tend to agree; however, LLMs aren't the entirety of the AI field at the moment, despite receiving a large amount of attention. This question is open to all forms of AI under development.
I don't think it is possible, yet.
AI is still at the big-money, big-technical-investment stage. It will be a decade or more before what you're talking about is possible.
Aren't there already a few free and open source tools available though? That's a part of what inspired this question tbh.
The codebases are free, but the training sets are not. To get the kind of intelligence you see in GPT-4, you need a huge amount of training data that is expensive to put together.
Honestly, if I, the underdog, want to utilize AI for my goals, my best bet is to pay $20/mo for the AI from OpenAI.
I thought the main obstacle was the computing power needed to update 175 billion parameters against large datasets. You could probably train a decent LLM using just Wikipedia, but I think it would still take a room full of expensive video cards.
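To make that concrete, here's a back-of-the-envelope sketch in Python using the often-quoted approximation of ~6 FLOPs per parameter per training token. Every number here (parameter count, token count, GPU throughput) is an illustrative assumption, not a measured figure:

```python
# Rough training-compute estimate, using the common ~6 FLOPs per
# parameter per token rule of thumb. All numbers are illustrative.

params = 175e9           # GPT-3-scale parameter count
tokens = 300e9           # training-token count commonly reported for GPT-3
flops_needed = 6 * params * tokens

# Assume one accelerator sustains ~100 TFLOP/s on this workload
# (a hypothetical, optimistic figure).
gpu_flops_per_sec = 100e12
gpu_seconds = flops_needed / gpu_flops_per_sec
gpu_years = gpu_seconds / (86_400 * 365)

print(f"Total compute: {flops_needed:.2e} FLOPs")
print(f"On one such GPU: ~{gpu_years:,.0f} GPU-years")
```

Under these assumptions it works out to roughly 100 GPU-years, which is why real training runs are spread across thousands of accelerators for weeks: the room full of expensive video cards.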
> a lot of training data that is expensive to put together.
Isn't training data simply data? If a community agreed to pool its data together, wouldn't that bypass the cost issue? Or is this one of those situations where the sheer amount of data required demonstrates how much businesses have arguably stolen from the public, so that no single community could produce enough data to bring its own AI tools up to the same level?
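To put rough numbers on that question, here's a toy comparison of corpus sizes. The token counts are order-of-magnitude assumptions, not measurements:

```python
# Toy comparison of a community-pooled corpus against GPT-3-scale
# training data. All figures are order-of-magnitude assumptions.

gpt3_training_tokens = 300e9   # token count commonly reported for GPT-3
wikipedia_tokens = 5e9         # English Wikipedia is a few billion tokens

avg_tokens_per_post = 100      # hypothetical average forum-post length
posts_needed = gpt3_training_tokens / avg_tokens_per_post

print(f"All of English Wikipedia: ~{wikipedia_tokens / gpt3_training_tokens:.1%} "
      f"of a GPT-3-scale training set")
print(f"Posts needed at ~{avg_tokens_per_post} tokens each: {posts_needed:.0e}")
```

Under these assumptions, even pooling everything a large community writes falls orders of magnitude short of what the big models were trained on, which is part of why the data side is the hard part.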