What Happens When Fake Citations Layer BS on BS?
Briefly

"Over the last two or three years, many of us working in universities have started noticing these strange references to academic journal articles or books that don't exist. The references are fabricated by generative artificial intelligence when an author does something like instruct a large language model to help write a paper or add in references to support a literature review."
"I think we all know by now that LLMs routinely make up material. That's what's happening here. LLMs are making up academic references because of academic workers using them to produce research outputs like papers, chapters or books."
Frankencitations are nonexistent references in academic papers, fabricated by large language models (LLMs). These invented citations undermine the credibility of academic work and have been observed with increasing frequency in recent years. Researchers, including Ben Williamson, have raised concerns about the implications of using LLMs for writing and referencing in scholarly outputs. The phenomenon points to a crisis of academic integrity: LLMs can easily produce false information that looks legitimate, spreading misinformation through the academic community.