Researchers confirm what we already knew: Google results really are getting worse
(www.theregister.com)
I'm finding that LLMs are doing a better job of searching for new things. If I have a question, instead of going to Google or Bing I'll go to ChatGPT and ask it, along with a request for some sources for further reading.
I never thought I'd need to use AI to answer a simple search, and yet here we are, because search engines no longer really serve their sole purpose.
The problem is, you can't trust ChatGPT to not lie to you.
And since generative AI is now being used all over the place, you just can't trust anything unless you know damn well that a human entered the info, and then that's a coin flip.
OTOH, you also can't trust humans not to lie to you.
That's the coin flip.
The newer ones search the internet, generate from the results rather than from their training data, and provide sources.
So that's not such a worry now.
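For what it's worth, that "search, then generate from the results, with sources" flow is easy to sketch. This is a minimal illustration, not any particular product's pipeline: the web_search() stub stands in for a real search API, and the model name is just an example.

```python
# Minimal sketch of retrieval-augmented answering: run a web search,
# then have the model answer only from those results, with citations.
from openai import OpenAI


def web_search(query: str) -> list[dict]:
    """Stub standing in for a real search API; returns title/url/snippet hits."""
    return [
        {"title": "Example result", "url": "https://example.com", "snippet": "..."},
    ]


def answer_with_sources(question: str) -> str:
    hits = web_search(question)
    context = "\n".join(
        f"[{i + 1}] {h['title']} ({h['url']}): {h['snippet']}"
        for i, h in enumerate(hits)
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name, not a recommendation
        messages=[
            {
                "role": "system",
                "content": "Answer only from the numbered sources below and cite "
                           "them like [1]. Say so if the sources are insufficient.\n"
                           + context,
            },
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content


print(answer_with_sources("Why are Google results getting worse?"))
```

The point of the system prompt is that the model is constrained to the retrieved text, so you can click through the cited sources and check the answer instead of trusting the model's training data.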
Anyone who used ChatGPT for information and not text generation was always using it wrong.
Except people are using LLMs to generate web pages just to get clicks, which means LLMs are now training on information generated by other LLMs. It's an ouroboros of fake information.
But again, if you use an LLM's ability to understand and generate text on top of a search engine's results, that doesn't matter.
LLMs are not supposed to give factual answers. That's not their purpose at all.
plus search engines don't lecture me as much for typing naughty sex words
Get on the unfiltered LLM train, they'll do anything GPT does and won't filter anything. Bonus if you run it locally and share with the community.
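If you want to try the local route, a rough sketch with llama-cpp-python looks like this; the GGUF path is a placeholder for whatever community model you've downloaded, and the settings are just reasonable defaults, not a tuned setup.

```python
# Rough sketch of running a local GGUF model with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/your-model.Q4_K_M.gguf",  # placeholder: any local GGUF file
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to the GPU if one is available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Why might search result quality decline over time?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

Everything runs on your own machine, so there's no remote content filter in the loop and nothing leaves your box.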
However, I find it much easier to check whether a given answer is correct than to find the answer myself.