Microsoft’s AI Tool Targets Factual Errors in AI Text
Do you know the world record for crossing the English Channel entirely on foot? Or when the Golden Gate Bridge was transported across Egypt for the second time? These are, without a doubt, ridiculous questions. Still, when users posed them to chatbots, the models produced confident, legitimate-sounding answers. Such responses are AI hallucinations, a persistent problem in AI-generated text. A recent addition to one of Microsoft's AI tools aims to curb the hallucination problem by fact-checking AI-generated text.


