
Fixing Peer Review, a Biologist’s View

November 21, 2016

This year three Nobel Prize-winning biologists broke with tradition and published their research directly on the internet as so-called preprints. Their motivation? Saving time.

This article by Tricia Serio originally appeared at The Conversation, a Social Science Space partner site, under the title “Peer review is in crisis, but should be fixed, not abolished”

Traditionally, scientific studies are published in peer-reviewed journals, which require other scientists to evaluate submitted research to determine its soundness for publication. Peer review is supposed to be a good thing, in theory acting as a check on science that isn’t sound, but it’s increasingly getting a bad rap. Beyond the time it takes to actually get the science done, peer review has become the slowest step in the process of sharing studies. Cycles of review, revision and resubmission in biology can span months to more than a year for a single manuscript. This situation hampers progress because it lengthens the time before breakthroughs become available to other scientists and the public.

How did things get so bad? It’s all about competition, supply and demand. Modern science is done in the context of a tournament mentality, with a large number of competitors (scientists) vying for a small number of prizes (jobs, tenure, funding). To be competitive, scientists must prove their “worth” through publications, and this pressure has created unanticipated challenges in how scientists report their own work and evaluate that of others – ultimately resulting in unacceptable delays in sharing sound science.

But trying to bypass this traditional route for sharing scientific results is not likely to advance scientific progress. As a journal editor and practicing scientist, I suggest we need to fix the real problem: our standards for publication. Done right, a recalibration would lead to fewer research papers – but that counterintuitive outcome may be exactly what’s needed to more efficiently advance scientific progress.

More money, more journals, more problems
Between 1995 and 2003, the U.S. National Institutes of Health’s budget increased 2.4-fold. With more research being funded, publishers doubled both the number of journals dedicated to biomedical research and the number of studies published, creating a US$9.4 billion scientific publishing industry.

But while the numbers have all increased in proportion to the funding, the quality has not. Scientific journals have a pecking order, and more “prestigious” journals are thought to have higher standards for publication. These standards are based on a hazy mix of the perceived quality of the work, its potential to significantly influence thinking in the field and the possibly unfounded reputation of the journal itself.

How one ranks journal prestige is the subject of heated debate, but one flawed and pervasive metric is the impact factor. The impact factor of a journal reflects the number of times publications in that journal are cited by other scientific publications. It’s often used by other scientists as a shorthand measure of recognition of published work.

Between 1997 and 2014, the number of journals publishing basic biological research increased by 212, but only four of these journals ranked in the top half of the impact factor scale. If one overlooks the flaws of the metric, these new journals may be seen as publishing work of perhaps lesser quality and limited impact. Indeed, I was told by a senior colleague when I was just starting my career that “a manuscript, once written, will be published somewhere,” insinuating that the quality of the work was irrelevant.

The proliferation of “low impact” scientific journals has also expanded the “publish or perish” mantra of academia. It now matters not only how much you publish but also where you publish. This striving for exclusivity allows “top tier” journals to demand even more from scientists, who are willing to extend their studies beyond what was previously considered a standalone report (the so-called “least publishable unit”) for the prize of a “good” publication.

For example, one analysis revealed that journal manuscripts published in 2014 contained significantly more data than those published in 1984. Producing more data takes longer and delays the release of studies that would previously have been considered complete. Over the same period, for instance, Ph.D. students at one top graduate program have spent an average of 1.3 years longer completing their degrees.

And this high bar is raised even further when individual journals reduce the number of studies that they publish.

Buried in a barrage of papers
The overall increased number of papers being written has also created a bottleneck in peer review, which negatively affects both quality and speed of publication.

I spend most of my editorial time trying to recruit qualified reviewers, who are increasingly too busy to fulfill this professional responsibility. There’s no restriction on how far down the list I’m permitted to go in my attempts. When I receive invitations myself, they now often give me the option to have people in my lab group complete the review on their own, expanding the scope of “peer” to include “student.” I have also recently been invited, with no obvious check of my credentials, to join a service that will pay me to review manuscripts. That is a departure from the norm: reviewing papers has traditionally been considered part of an academic’s responsibility to the field and thus unpaid.

With this erosion of the peer review system, spectacular failures are inevitable, such as the peer-reviewed study that credited a divine “Creator” for the link between the structure of the hand and its grasping ability.

Even without the explosion of preprints that may be on the horizon, scientists are having a hard time keeping up with the literature as it is. In a survey by the magazine The Scientist on the prevalence of omitted references, 85 percent of respondents said the failure to cite previous studies in new publications is a serious or potentially serious problem in the life sciences. This slip in keeping current may lead to the persistence of incorrect conclusions and to duplicated and therefore wasted effort. I recently reviewed a manuscript and pointed out that the vast majority of what was reported had been previously published, although none of the three other reviewers made this connection.

Thus, calls for self-publishing need to take scale into account; it may work for physics and mathematics, but in 2015 there were sixfold more manuscripts published in biology than in physics, and 24-fold more than in mathematics.

Keeping sight of the goal: Facilitating scientific progress
Without question, scientific advances, funded by the public, should be shared without delay, a goal championed by the #ASAPbio movement. Indeed, reporting observations quickly for other scientists to use may seem like a good way to facilitate progress, but in reality, context is everything. There’s simply no way to remember the vast number of details if they’re not associated with a breakthrough in understanding. It’s these breakthroughs that provide a framework for not only organizing the details but also vetting their accuracy. As a practical example, I know what I was wearing (detail) on Oct. 28, 2007, the day my son was born (context), but I have no idea what I wore (out-of-context detail) on Oct. 27.

To realize a faster pace of scientific progress, we need to balance the goal of sharing data with an assessment of quality and impact. Proponents of self-publication on internet servers such as bioRxiv suggest that scientists are so concerned with their reputations that they will not release unsound studies, but the increasing prevalence of retracted peer-reviewed articles, irreproducible results and text reuse argues that the pressures of the tournament can sometimes trump individual restraint.

Peer review clearly isn’t perfect, but rather than simply bypassing it and releasing even more information into an overloaded system, we should focus on making it better. The first step is to reset and clearly state our standards for quality in both publishing and peer reviewing. The outcome will certainly be fewer publications in biomedicine, but their individual impact will be greater. As a result, scholars will have a fighting chance to dedicate more time to evaluating new research and keeping up with the literature, which will facilitate progress.

Scientists and journals have driven the more-is-better mentality and don’t have the incentives to make these corrections. Instead, universities and granting agencies, which use publications as standards for evaluation, and the public, which funds much of the research, must lead the charge to develop a mechanism for journal accreditation, with clear standards for publication and peer-review quality. If publishing scientific advances is worth doing, it is worth doing right.


Tricia Serio is professor and department head in molecular and cellular biology at the University of Arizona. She is an associate editor at PLOS Genetics and guest editor at PLOS Biology and PNAS.
