this post was submitted on 09 Oct 2024
611 points (96.6% liked)
Technology
LLMs have their flaws, but for my use they're usually good enough. It's rarely mission-critical information that I'm looking for. It satisfies my thirst for an answer, and even if it's wrong, I'll probably forget it in a few hours anyway. If it's something important, I'll start with ChatGPT and then fact-check it by looking up the information myself.
So, let me get this straight...you "thirst for an answer", but you don't care whether or not the answer is correct?
This is like an addiction to YouTube "top 10 facts" videos and the like.
Of course I care whether the answer is correct. My point was that even when it’s not, it doesn’t really matter much because if it were critical, I wouldn’t be asking ChatGPT in the first place. More often than not, the answer it gives me is correct. The occasional hallucination is a price I’m willing to pay for the huge convenience of having something like ChatGPT to quickly bounce ideas off of and ask about stuff.
I agree that AI can be helpful for bouncing ideas off of. It's been a great aid in learning, too. However, when I'm using it to help me learn programming, for example, I can run the code and see whether or not it works.
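For instance, if it suggests a small helper function while I'm learning, I can sanity-check it with a couple of quick asserts before trusting it. A made-up sketch (the function and test cases here are just an illustration, not something from the thread):

```python
# Hypothetical example: say an LLM suggested this while I was learning Python.
def reverse_words(sentence: str) -> str:
    """Return the sentence with its words in reverse order."""
    return " ".join(reversed(sentence.split()))

# Quick sanity checks: run the suggested code against inputs whose answers
# I already know, instead of trusting the model's explanation alone.
assert reverse_words("hello world") == "world hello"
assert reverse_words("one") == "one"
assert reverse_words("") == ""

print("all checks passed")
```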
I'm automatically skeptical of anything they tell me, because I know they could just be making something up. I always have to verify.