The fake citation landscape: examples from the wild

Nick Morley looks at real examples of hallucinated citations in published research, and explains what they mean for science
You’ve probably heard of academic publishing’s hottest new foe: fake references, or hallucinated citations. This demon comes in many forms and poses a major threat to quality and reputation. In this article we’ll explore a range of real examples that made it past peer review into published literature, to help you understand the risk and what you can do about it.
Phantom sources
The boundary between real and fake is a blurry continuum, running from subtle inconsistencies to complete fabrication. We’ll start with references to sources that do not appear to exist at all:
Holder, T., et al.: Momentum-space topology in materials. Nat. Commun. 12, 2389 (2021)
(from: https://link.springer.com/article/10.1007/s00033-026-02793-x)
While a Tobias Holder has published in Nature Communications on topics related to “momentum space”, no such paper can be found there. A broader Google search for this title yields no close matches.
Examples like this are easy to disprove with a combination of title search and a lookup of the cited journal volume, issue and pages. However, this author+topic+journal combination would look highly plausible to a domain expert, so it would easily slip under the radar without active verification.
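As an illustration, the title-search half of that check can be automated by fuzzy-matching the cited title against whatever a bibliographic search returns. This is a minimal sketch of the matching logic only; the candidate titles below are hypothetical stand-ins for real search results from a database such as Crossref:

```python
import re
from difflib import SequenceMatcher

def normalize(title: str) -> str:
    """Lowercase and strip punctuation so trivial differences don't count."""
    return re.sub(r"[^a-z0-9 ]", "", title.lower()).strip()

def is_close_match(cited: str, candidate: str, threshold: float = 0.9) -> bool:
    """True if two titles agree up to case, punctuation and small edits."""
    ratio = SequenceMatcher(None, normalize(cited), normalize(candidate)).ratio()
    return ratio >= threshold

cited = "Momentum-space topology in materials"
# Hypothetical search results: nothing close to the cited title.
candidates = [
    "Topological invariants and momentum-space geometry",
    "Electron hydrodynamics in two-dimensional materials",
]
print(any(is_close_match(cited, c) for c in candidates))  # → False
```

The threshold of 0.9 is an arbitrary illustrative choice; a production checker would tune it and would also weigh author and venue agreement before declaring a reference phantom.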
More examples like this:
Maignan, I., & Ferrell, O. (2022). Corporate social responsibility and sustainable reputation management. Journal of Strategic Marketing, 30(2), 101–120.
(from https://economics.pubmedia.id/index.php/jampk/article/view/1118)
Jamaldeen J, Tan J, Healy G, et al. Hearing loss in patients with chronic kidney disease: a review of current literature. J Nephrol. 2022;35(1):11–21.
(from https://www.tandfonline.com/doi/full/10.1080/0886022X.2025.2590865)
For journal references such as these, the most effective way to disprove existence is to check the journal’s page for the specific volume and issue given, though be wary of translated titles and less accessible journals.
DOI delusions
Don’t we already have the perfect instrument for this: the unique identifier? Can’t we just require DOIs and verify that they resolve?
Well, a valid DOI does not guarantee a valid reference:
Real DOI, bad reference
Zhang M., Zhang Y., Fu S., Lv F., Tang J. Thyroid nodules with suspicious ultrasound features: the role of ultrasound-guided fine-needle aspiration cytology in clinical practice. Cancer Cytopathol. 2020;128(12):848–856. doi: 10.1002/cncy.22335.
(from: https://linkinghub.elsevier.com/retrieve/pii/S2589-8450(25)00098-3)
The DOI resolves to a completely different article, while the document as cited does not appear to exist.
Conversely, just because the DOI is invalid doesn’t mean the paper isn’t real:
Real paper, bad DOI
T. H. Miranda, M. G. Hoeppner, C. C. D. Garbelini, et al., “LED Photobiomodulation Effect on the Bleaching-Induced Sensitivity With Hydrogen Peroxide 35%: A Controlled Randomized Clinical Trial”; Clinical Oral Investigations 26, no. 5 (2022): 3853-3864, https://doi.org/10.1007/s00784-021-04263-4.
(from: https://onlinelibrary.wiley.com/doi/10.1111/jerd.70165)
There are many reasons this can arise, from generative-AI mishaps to OCR/parsing issues or the humble typo. Content may also change hands or be re-indexed over time, leaving DOIs obsolete or superseded.
So, DOI validity, while necessary, is not sufficient.
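The upshot is that a DOI check must go one step further: resolve the DOI, then compare the resolved record against the citation field by field. Here is a minimal sketch of that comparison, assuming the resolved metadata has already been fetched (in practice from a service such as Crossref); the sample records below are illustrative, not real API responses:

```python
def metadata_mismatches(cited: dict, resolved: dict) -> list[str]:
    """List the fields on which the citation and the DOI's record disagree."""
    fields = ("title", "journal", "year", "pages")
    return [
        f for f in fields
        if cited.get(f) and resolved.get(f)
        and str(cited[f]).strip().lower() != str(resolved[f]).strip().lower()
    ]

# "Real DOI, bad reference": the DOI resolves, but to a different article.
cited = {"title": "Thyroid nodules with suspicious ultrasound features",
         "journal": "Cancer Cytopathol", "year": 2020}
resolved = {"title": "An entirely different article (illustrative)",
            "journal": "Cancer Cytopathol", "year": 2020}
print(metadata_mismatches(cited, resolved))  # → ['title']
```

The converse case, real paper with a bad DOI, yields to the same comparison run in the other direction: find the paper by title search, then check whether its registered DOI matches the one cited.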
This mismatch between DOI and reference brings us to another category of mismatch: mis-attribution.
Franken-references
Real components combined in a plausible but imaginary way. Take this example of a real paper with different authors spliced in:
Real paper, wrong authors
Gofflot F, Chartoire N, Vasseur L, et al. Functional human beige adipocytes from induced pluripotent stem cells. Diabetes. 2017;66(6):1470–1478. doi: 10.2337/db16-1107
(from: https://www.tandfonline.com/doi/full/10.1080/21541264.2025.2521766)
These three authors are “real” in that they have published together with their names presented exactly as above: Gofflot F, Chartoire N, Vasseur L, et al. Systematic Gene Expression Mapping Clusters Nuclear Receptors According to Their Function in the Brain. Cell. 2007;131(2):405-418
However, the actual authors of this paper are Anne-Claire Guénantin, Nolwenn Briand, Émilie Capel, et al. (see https://diabetesjournals.org/diabetes/article-abstract/66/6/1470/40058/Functional-Human-Beige-Adipocytes-From-Induced?redirectedFrom=fulltext).
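Splices like this can be flagged by checking whether the named cited authors overlap with the paper’s actual author list at all (an “et al.” truncation means only the named authors can be compared). A rough sketch using the names from the example above; the token-overlap heuristic is deliberately crude and is an illustrative assumption, not a production algorithm:

```python
def surname_tokens(name: str) -> set[str]:
    """Name tokens long enough to plausibly be surnames; initials are ignored."""
    tokens = name.replace("-", " ").split()
    return {t.strip(".,").lower() for t in tokens if len(t.strip(".,")) >= 3}

def authors_overlap(cited: list[str], actual: list[str]) -> bool:
    """True if the two author lists share any name token.
    Crude on purpose: first names can collide, so a hit is a hint, not proof."""
    cited_tokens = set().union(*(surname_tokens(a) for a in cited))
    actual_tokens = set().union(*(surname_tokens(a) for a in actual))
    return bool(cited_tokens & actual_tokens)

cited = ["Gofflot F", "Chartoire N", "Vasseur L"]
actual = ["Anne-Claire Guénantin", "Nolwenn Briand", "Émilie Capel"]
print(authors_overlap(cited, actual))  # → False: no shared author
```

Zero overlap between cited and actual authors, as here, is a strong signal of a franken-reference; partial overlap is more ambiguous and needs human review.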
This takes us beyond correctness to the issue of fairness. Such references credit the wrong authors while the right ones miss out. As a result, researchers receive citation alerts for papers they never wrote, feeding the research community’s growing embitterment with the state of academic publishing.
Of course it’s not just authors that get misattributed; journals do too:
Wrong journal
Kramer BJ, Boelk AZ. Correlates and predictors of conflict at the end of life among families enrolled in hospice. J Soc Work End Life Palliat Care 2015; 11(1): 5-22.
(from: https://journals.sagepub.com/doi/full/10.1177/26323524251326105)
The actual journal is the Journal of Pain and Symptom Management.
These are some of the more salient failure modes, but mismatch errors extend to every combination of metadata fields you can imagine.
Hallucination tax
We see this and more every day.
Reference errors like this, once exposed, completely dissolve trust. They call into question the quality and integrity of the work, the institution and the publisher.
Reference errors also materially affect citation metrics by breaking citation matching in databases like Web of Science and Scopus. Each time a document is referenced incorrectly, its authors and journal risk being shortchanged, harming their h-index or impact factor.
Furthermore, as tools to detect these errors improve, they provide a clear-cut way to measure the citation health of a paper or journal. Today, journals are punished for high self-citation or citation stacking; tomorrow they will be punished for fake or inaccurate citations. Failure to get ahead of this may pose a severe risk to a journal’s longevity.
Entirely fixable
We have explored common citation failure modes with real examples and shone a light on the growing risk. The prevalence of this issue is only beginning to be understood, but the levels are clearly already of serious concern.
The good news is that these errors can be caught with simple verification procedures that are ripe for automation. Checking source existence and metadata correctness can run as a support function that protects people from advancing or consuming information with bad citations. This article is a demonstration of that principle: all examples were identified automatically using the Veracity citation checking platform.
Our goal is to arm the research ecosystem against this problem and prevent bad information from reaching readers. Failure here breeds distrust and, ultimately, the downfall of institutions and publishers. This problem is entirely solvable with the right verification procedures in place.
Nick Morley is Chief Citation Checker and head of product at Grounded AI, which provides tools for efficient citation verification at all stages of the research and publication lifecycle.
