HeadlinesBriefing.com

EY Withdraws AI Hallucination Study After Fabricated Data

Financial Times · Companies

EY has withdrawn a cybersecurity study after researchers discovered AI hallucinations, including fabricated data, misattributed citations, and fake footnotes. The "Points of Attack" report, which EY Canada consultants used to market services, claimed the loyalty-scheme market was worth $200bn while also stating that unclaimed points were worth exactly the same figure.

GPTZero researchers found that more than half of the footnotes led to non-existent web pages or to pages that did not contain the cited information. The report also referenced a McKinsey study that does not exist. The episode follows similar problems at Deloitte, which last year had to revise a government report containing fake academic citations, and at Sullivan & Cromwell, which misquoted the US bankruptcy code.

The incident raises concerns about the reliability of AI in professional services. Despite these problems, EY says its AI-related revenue grew 30% last year, with 15,000 staff working on client AI projects. The firm has removed the study and is reviewing the circumstances of its publication, emphasizing its "commitment to responsible AI use" in a competitive market where AI adoption is seen as key.