One of the world’s largest academic publishers is selling a book on the ethics of artificial intelligence that appears to be riddled with fake citations, including references to journals that do not exist.
Academic publishing has recently come under criticism for accepting fraudulent papers produced using AI, which have made it through a peer-review process designed to guarantee high standards.
The Times found that a book recently published by the German-British publishing giant Springer Nature includes dozens of citations that appear to have been invented — often a sign of AI-generated material.

The book — Social, Ethical and Legal Aspects of Generative AI — is advertised as an authoritative review of the ethical dilemmas posed by the technology and is on sale for £125. At least two chapters include footnotes that cite scientific publications that appear to have been invented.
In one chapter, eight of the 11 citations could not be verified, suggesting that more than 70 per cent may have been fabricated.
The findings come amid growing concern within academia about citations and even entire research papers being generated by AI tools that try to mimic genuine scholarly work.
In April, Springer Nature withdrew another technology title — Mastering Machine Learning: From Basics to Advanced — after it was found to contain numerous fictitious references.
In the more recent book analysed by The Times, one citation refers to a paper supposedly published in the “Harvard AI Journal”. Harvard Business Review has said that no such journal exists.
Guillaume Cabanac, an associate professor of computer science at the University of Toulouse and an expert in detecting fake academic papers, analysed two chapters using BibCheck, a tool designed to identify fabricated references.
He found that at least 11 of the 21 citations in the first chapter could not be matched to known academic papers. The analysis also suggested that eight of the 11 citations in chapter 4 were untraceable.
“This is research misconduct: falsification and fabrication of references,” Cabanac said. He tracks such cases and says he has seen a steady rise in AI “hallucinated” citations across academic literature.
He said: “Researchers build knowledge by relying on previously published research … When [these studies] are fragile or rotten, we can’t build anything robust on top of that.”
A separate review carried out by Dr Nathan Camp of New Mexico State University reached similar conclusions. Camp, who has studied the rise of fake AI-generated citations, found numerous erroneous, mismatched or wholly invented references in the AI ethics book.
In some cases, details of different genuine papers appeared to have been combined. Six other chapters appeared to be accurate. Each chapter was written by a different set of authors.
Camp said: “While it is difficult to definitively ascertain whether or not the citations used are AI-generated, they are certainly erroneous at best, likely fabricated, and the simplest way to fabricate citations is with AI.”
James Finlay, vice-president for applied sciences books at Springer Nature, said: “We take any concerns about the integrity of our published content seriously. Our specialist research integrity team is investigating this case as a priority.”
He added: “Our integrity team works with editors and uses specialist expertise and detection tools to uphold our standards and catch any integrity issues ahead of time. A small number, however, may slip through.”