this post was submitted on 20 Mar 2024
Technology
Obviously it depends on the LLM, but ChatGPT Plus doesn't hallucinate with your example. What it does is provide a list of Git projects / Windows programs, each with a short summary and a link to the official website.
And the summary doesn't come from the website; it's a short description of how each program matches your list of requirements.
I've also noticed that Bing has started showing LLM summaries for search results. For example, I've typed a question into DuckDuckGo (which uses Bing internally) and seen links to Reddit where the summary reads something like "a user answered your question stating X, and another user disagreed, saying Y".
I'm encountering hallucinations far less often now than I used to, at least with OpenAI-based products.