Chatbots sometimes make things up. Not everyone thinks AI’s hallucination problem is fixable

Spend enough time with ChatGPT and other artificial intelligence chatbots and it doesn't take long for them to spout falsehoods.

Described as hallucination, confabulation or just plain making things up, it's now a problem for every business, organization and high school student trying to get a generative AI system to compose documents and get work done. Some are relying on it for tasks with the potential for high-stakes consequences, from psychotherapy to researching and writing legal briefs.
