When German journalist Martin Bernklau typed his name and location into Microsoft’s Copilot to see how his articles would be picked up by the chatbot, the answers horrified him. Copilot’s results asserted that Bernklau was an escapee from a psychiatric institution, a convicted child abuser, and a conman preying on widowers. Bernklau had worked for years as a courts reporter, and the AI chatbot had falsely blamed him for the crimes whose trials he had covered.
The accusations against Bernklau weren’t true, of course, and are examples of generative AI’s “hallucinations.” These are inaccurate or nonsensical responses to a prompt provided by the user, and they’re alarmingly common. Anyone attempting to use AI should always proceed with great caution, because information from such systems needs validation and verification by humans before it can be trusted.
But why did Copilot hallucinate these terrible and false accusations?
These are not hallucinations, whatever that is supposed to mean lol
The tool is working as intended and getting wrong answers because of how it works. His name frequently had these words around it online, so the AI told the story it was trained on. It doesn't understand context. I'm sure you can also ask it clarifying questions and it will admit it is wrong and correct itself... (see the toy sketch below)
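A minimal sketch of the point, assuming the Hugging Face transformers library and the small GPT-2 model (both chosen here purely for illustration, not mentioned in the story): a language model just continues a prompt with statistically likely tokens learned from its training data, and nothing in that process checks whether the continuation is true of a real person.

# Sketch: text generation is statistical continuation, not fact-checking.
# Assumes the "transformers" package (and a backend like torch) is installed.
from transformers import pipeline

# Load a small general-purpose language model for demonstration.
generator = pipeline("text-generation", model="gpt2")

# A prompt resembling text the model may have seen near a reporter's name.
prompt = "The court reporter covering the trial was"

# The model appends whichever words tend to follow this phrasing in its
# training data; truth about any specific person never enters the process.
result = generator(prompt, max_new_tokens=25, num_return_sequences=1)
print(result[0]["generated_text"])

Run it a few times and you get different, equally confident continuations, which is exactly the behavior the comment above describes.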
AI🤡
https://cloud.google.com/discover/what-are-ai-hallucinations#:~:text=AI%20hallucinations%20are%20incorrect%20or,medical%20diagnoses%20or%20financial%20trading.
AI hallucinations are incorrect or misleading results that AI models generate. These errors can be caused by a variety of factors, including insufficient training data, incorrect assumptions made by the model, or biases in the data used to train the model.
Yes, hallucination is the now-standard term for this, but it's a complete misnomer. A hallucination is when something that does not actually exist is perceived as if it were real. LLMs do not perceive, and therefore can't hallucinate. I know, the word is stuck now and fighting against it is like trying to bail out the tide, but it really annoys me and I refuse to use it. The phenomenon would better be described as a confabulation.