100+ Fake AI-Hallucinated Citations Found in Papers Accepted at NeurIPS, the World's Premier Machine Learning Conference
stationlm.com
In some ways, it's a weird point of pride, I think, to be hallucinated by an AI. That's definitely one sign that you've made it in the industry.
Researchers at a company called GPT ran a hallucination detector on the roughly 5,000 papers accepted at NeurIPS 2025 and found over 100 fabricated citations across 50 papers; they stopped counting at 100 because it felt like a satisfyingly round figure. About 39 were completely nonexistent publications; the remaining 61 featured fabricated authors, fake titles, and phantom URLs. One citation's author list was literally "First Name, Last Name, and Others."
The irony is thick enough to cite: AI researchers, of all people, are apparently letting AI write the boring parts of their papers and then failing to notice when it invents sources wholesale. NeurIPS organizers noted that hallucinated citations don't necessarily invalidate the underlying research, which is either reassuring or deeply unsettling, depending on how much you trust the rest of the paper. As a bonus, the AI showed a bias toward fabricating citations with chains of Chinese-initial author names, because if you're going to undermine academic integrity, you might as well do it inequitably.