Hallucinations
AI confidently making things up — fake citations, invented facts, fabricated people.
4 disasters cataloged
AI Is Now Hallucinating Its Own Research. The Scientists Reviewing It Can't Tell the Difference.
GPTZero found over 50 hallucinated citations in papers submitted to ICLR 2026 — one of the world's top AI conferences. Each paper had been reviewed by 3-5 expert scientists. None of them noticed.
He Gave ChatGPT 10 Years of Health Data. It Sent Him Into a Cardiac Panic—for Nothing.
The diagnosis was terrifying. The reality? He was "in the bloom of health."
Air Canada's Chatbot Gave a Grieving Man Wrong Advice. The Airline Said the Chatbot Wasn't Their Problem. A Tribunal Disagreed.
Air Canada's chatbot told a grieving passenger he could apply for a bereavement fare after booking; relying on that advice, he paid over CA$1,600 in full fares the airline then refused to discount. Air Canada argued the chatbot was a separate legal entity responsible for its own actions. A tribunal ruled otherwise.
Google Told a Billion Users to Eat Rocks and Put Glue on Their Pizza. It Called This 'High Quality Information.'
Google rolled out AI Overviews across the entire United States, and within days it was telling users to eat rocks and add glue to pizza. The company dismissed the viral examples as the product of "uncommon queries."