How Few Papers Ever Get Cited? It’s Bad, But Not THAT Bad

It’s an academic wasteland at times: Four out of five papers in the humanities never get cited, according to one study. (Photo by Jez Arnold)
“90% of papers published in academic journals are never cited.” This damning statistic from a 2007 overview of citation analysis recently darted about cyberspace. A similar statistic had made the rounds in 2010, but that time it was about 60 percent of social and natural science articles that were said to be uncited. Neither statistic came with a link to supporting academic research papers.

Dahlia Remler
This article by Dahlia Remler originally appeared on her personal blog, under the title “Are 90% of academic papers really never cited? Searching citations about academic citations reveals the good, the bad and the ugly.” It appears here with her permission.

That lack of support was a problem for me. I did not doubt the basic truth that many academic papers are uncited. But to be sure 90 percent was not urban legend, and to learn the context and caveats, I needed to find the original research paper. I was not the only one who wanted the supporting evidence. So, I dove into Google Scholar, searching the disparaged academic literature for articles on academic citation rates.

What’s the truth?

Many academic articles are never cited, although I could not find any study with a result as high as 90 percent. Non-citation rates vary enormously by field. “Only” 12 percent of medicine articles are not cited, compared to about 82 percent (!) for the humanities. It’s 27 percent for natural sciences and 32 percent for social sciences (cite). For everything except humanities, those numbers are far from 90 percent but they are still high: One third of social science articles go uncited!

Ten points for academia’s critics.

(Before we slash humanities departments, though, remember that much of their most prestigious research is published in books. On the other hand, at least in literature, many books are rarely cited, too.)

The uncited rate is also sensitive to other factors: how long a window is used to check for citations (e.g., five years); when the article whose cites are being counted was published (2000s or 1990s); and what counts as a citation. The uncited rates I gave as “the” rates are really five-year citation rates in all Thomson’s Web of Science journals, and that is not comprehensive. The details of whether to include self-citations, non-academic articles, and so on, also matter.

Another reason for the various uncitedness rates floating around is confusion between the average citation rates of journals and citation rates of articles. Within a given journal, some articles have many citations while others have few and many have zero—citations within a given journal are skewed. The average rate of citations for a whole journal, the impact factor, is pulled up by the few articles with many citations. Focusing on the impact factor will make it seem like more articles get cited than actually do. Ironically, a Chronicle of Higher Education article bemoaning the low rate of citations understated its case by assuming the average citation rate for journals applied to articles.
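The arithmetic behind this confusion is easy to see with a toy example. The numbers below are entirely hypothetical, but they show how a handful of heavily cited articles can give a journal a respectable average while half its articles go uncited:

```python
# Hypothetical citation counts for ten articles in one journal.
# A few highly cited articles dominate the distribution.
citations = [60, 12, 5, 2, 1, 0, 0, 0, 0, 0]

# The journal-level average (the "impact factor" logic).
mean_cites = sum(citations) / len(citations)

# The article-level reality: the fraction never cited at all.
uncited_share = citations.count(0) / len(citations)

print(f"average citations per article: {mean_cites}")   # 8.0
print(f"share of articles never cited: {uncited_share:.0%}")  # 50%
```

An average of eight citations per article sounds healthy, yet half the articles in this imagined journal were never cited at all—exactly the gap between the two kinds of rates described above.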

Clearly, academic articles have a serious problem. But my experiences highlighted another bad thing about academic articles—and a really good thing.

I had a hard time finding the rates at which articles were uncited, because the overwhelming majority of relevant articles were about other things, such as the effect of time windows, different inclusion criteria for citations, whether the Internet has changed citation practices and so on. Those are all good things to investigate, but in the grand scheme of things, they are not as important as the large share of articles going uncited altogether. Another point for academia’s critics, who contend that academics worry about small things no one else cares about and miss the big things.

But my experience also showed what’s great about academic articles. You get to learn how people reached their conclusions and judge the methods yourself. You also get the assurance that knowledgeable people paid attention to how things were done and the validity of the conclusions. Contrast the accuracy and information in the academic articles that I have linked to with the figures from non-academic outlets that darted around the Internet.


And what about that 90 percent figure? It came from an article by an expert in citation analysis, Lokman Meho, then at Indiana University, in Physics Today, a (journalistic) magazine associated with the American Institute of Physics. No wonder it got (inaccurately) described in cyberspace as a “study at Indiana University.”

Meho explained the 90 percent by email, “The first paragraph of the article was written by the editor of the magazine and not me. If I recall correctly, he got the figures from/during a lecture he attended in the UK in which the presenter claimed that 90% of the published papers goes uncited and 50% remains unread other than by authors/referees/editors.” Meho noted that the 90 percent figure was about right for the humanities but not other fields.

Five points for academia’s supporters. No editor could do anything remotely like that to an academic article. (It’s bad journalism too.) In academic articles, all methods are explained and all claims are supposed to be evaluated by other experts.

Academic publication needs fixing. Even the 12 percent uncited rate for medicine seems large to me, particularly given what medical research costs. The one-third rate for social science and more than 80 percent for humanities are really troubling. But whatever we do, let’s preserve somewhere what’s good about academic articles — full descriptions of methods and expert evaluation.


Dahlia Remler

Dahlia Remler is a professor at the School of Public Affairs, Baruch College, City University of New York and the Department of Economics, Graduate Center, City University of New York. She is also a research associate at the National Bureau of Economic Research.


I agree with Daniel_L – not cited doesn’t mean not read. I wish there were a way of crediting papers for their use in undergraduate teaching that was equivalent to monitoring citations. Also, in some areas five years is nothing – e.g. taxonomy – and in others the work is out of date after even a few years.

Richard A.

If you go onto ISI and search how many articles are never cited, have a single citation or 2 citations, the results are quite consistent. Published articles that have zero citations range from 12-20%. Those with 2 or fewer range from 16-25%. The reason in part is that ISI includes meeting abstracts that are rarely cited (about 50% of the non-cited articles). Thus, articles that are never cited are in the 5-10% range, which is much lower than the 65-90% discussed in other parts of this discussion (this is primarily for the physical sciences). Certainly a thought-provoking discussion!


I am writing an article for the Canadian Journal of Chemical Engineering. In fact, 90% of scientific articles are cited at least one time. The database is the Web of Science Core Collection, and the time frame I chose is 2000 to 2005. Of the 5 million papers in the humanities, sciences, engineering and medicine, only about 500,000 are uncited until now. Citation rates are higher in some fields (up to 98%), but in the humanities they are lower, though still around 70%.


Robert Matthews

Really appreciated this debunking effort. Always fascinating to see how a myth gets started, and to have someone dig out the references. Thanks !

Ira Allen

This is interesting to see–thanks for sharing this research! In such discussion, I think it’s important to recall how very different are the reading practices back behind citational practices. So, in much of the humanities, people primarily cite articles that they’ve read reasonably carefully and–most importantly–are engaging with in some way. Sure, you get a little drive-by citing (I’ve been guilty of it myself, as a friend recently reminded me regarding a reference I made to a book of his), but it’s not the absolute norm. Because most humanities fields don’t see knowledge as a positive accumulation of justifiable true… Read more »


The research is very nice to have, and this is certainly a conversation worth having. I wonder, though, whether a 100% citation rate–even in medical journals–is really a desirable goal? For two reasons: 1. Articles can have influence without ever being cited, at least in theory – they could be taught in undergraduate courses, or shape someone’s thinking in a way that’s not really cite-able in a literature review. Probably most that have that level of influence would also get cited, though. But reason 2: maybe no one’s citing those articles because no one’s doing research building on them in… Read more »
