Infrastructure

Could Distributed Peer Review Better Decide Grant Funding?

October 20, 2025

The landscape of academic grant funding is notoriously competitive and plagued by lengthy, bureaucratic processes, exacerbated by difficulties in finding willing reviewers. Distributed peer review (DPR) offers an alternative: in this model, applicants themselves review other proposals within the same funding call.

A recent study by the Research on Research Institute, conducted in partnership with the Volkswagen Foundation, compared DPR to traditional panel review for the same funding proposals to evaluate DPR’s robustness, benefits, and challenges. We found that while the two methods show some overlap in their outcomes, they present distinct advantages and challenges for both researchers and funders.

A flow chart showing stages of distributed peer review.
Panel review and DPR processes as implemented at Volkswagen Foundation

Comparing funding outcomes and review scores

At a high level, the study found a modest overlap in the final decisions for the funding call. Of the 10 proposals recommended by DPR and 11 by panel review, three were selected by both. There was, however, greater agreement earlier in the process: nine of the 10 DPR-funded proposals were shortlisted for panel review, and eight of the 10 went on to be discussed in the panel meeting, suggesting the most promising research ideas were identifiable by both processes.

Trimmed mean of all proposals (with 95 percent confidence intervals) according to rank under DPR and funding status. For the purposes of this visualization, when ranks were tied proposals were ordered by standard error. Any remaining ties were broken by proposal order.

While the final funding decisions were different between the DPR and the panel processes, interestingly, the weighting given to different criteria in assessing the applications by DPR and panel reviewers showed little variation. Both groups consistently prioritized originality, suggesting a shared understanding of what makes a proposal compelling.

This article by Melanie Benson Marshall, Anna Butters, Stephen Pinfield and Tom Stafford originally appeared on the LSE Impact of Social Sciences blog as “Distributed Peer Review – How the wisdom of the crowd can allocate grant funding.”

A major distinction, however, lies in the nature of the decision-making. The DPR process is built on isolated, individual assessments, where each reviewer makes a judgment independently. This contrasts traditional panel review, where reviewers engage in real-time discussion and debate, potentially allowing for a more nuanced understanding of each proposal. While many panel members viewed this as a positive, collaborative feature, it also introduces a layer of complexity, where interpersonal dynamics, dominant personalities, and group consensus can influence outcomes. The DPR process, by its nature, sidesteps these social influences, making each decision an independent one.

Advantages of Distributed Peer Review

The study identified several major benefits of the DPR process that address many of the pain points of traditional grant funding.

1. A Faster, More Efficient Process: One of the most significant advantages is the dramatic reduction in decision-making time. The traditional panel review studied here can be slow and sequential, with shortlisting, individual reviews, and panel meetings taking a total of 13-14 weeks. DPR streamlines these steps, with all reviewing occurring concurrently over a period of just six weeks.

2. More Abundant and Diverse Feedback: The DPR system also provides applicants with a richer, more diverse array of feedback. Because the review burden is distributed among a large pool of applicants, each proposal receives more attention and a wider range of perspectives. We found that a single proposal received a median of 12.33 hours of review time under DPR, compared to the 45 minutes to four hours typically spent in panel review. For applicants, this feedback was valued as a tool for learning and improving future submissions.

3. Enhanced Fairness and Transparency: Many participants perceived DPR as fairer and more democratic. By distributing the review responsibility more broadly and through more structured, anonymised mechanisms, DPR can reduce the impact of personal biases and established academic hierarchies. This enhanced perception of fairness builds greater trust and legitimacy within the research community, as decisions are seen as more objective and equitable.

4. A Valuable Learning Experience: For many participants, particularly early-career researchers (ECRs), participating in the DPR process was a valuable learning opportunity. It provided invaluable insights into what makes a strong proposal. This sense of empowerment and ownership was highly valued, as was the feeling of being reviewed by ‘true peers’ at a similar career stage. The innovative nature of DPR was also a draw for many participants, injecting enthusiasm into what can often be a conservative, bureaucratic process.

5. A Solution to Reviewer Recruitment: From a funder’s perspective, DPR solves the persistent challenge of recruiting enough qualified reviewers. The process creates a built-in pool of experts from the applicant base itself, relieving some of the administrative burden of coordinating the review process.

Challenges of Distributed Peer Review

Despite these compelling advantages, the DPR process is not without challenges.

1. Increased Workload for Applicants: A prominent concern is the additional workload placed on applicants. Most did not find the burden excessive, though this was partly because the proposals in this specific study were shorter than usual. The overall time and effort required should therefore be carefully considered for future implementations.

2. Lack of Discussion: A key drawback compared to traditional panels is the absence of discussion and clarification. The isolated nature of DPR means reviewers cannot ask probing questions or collectively deliberate on a proposal’s nuances. This could lead to a less refined evaluation, although it also avoids the negative social dynamics and conformity biases that can influence panel discussions.

3. Concerns Over Reviewer Expertise: Applicants, panel members and Volkswagen Foundation staff voiced concerns that reviewers, especially those new to the process, might be inexperienced or lack expertise. While a larger pool offers diverse perspectives, it may not provide the deep disciplinary knowledge of seasoned panel members. However, we found that, on average, panel reviewers and applicants were equally confident in their expertise as reviewers.

4. Potential for Gaming: A persistent concern is the potential for applicants to ‘game’ the system by giving rival proposals unfairly low scores. While this did not appear to be a widespread issue in this study, aided by a ‘trimming’ process that excluded outlier scores, it is a risk inherent to a system based on direct competition. There are other possible solutions to the gaming problem, such as splitting applications into two pools, where reviewers only assess proposals in a different pool from their own application. This approach seems to work best where proposals are similar.
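The study does not spell out the exact trimming rule used at the Volkswagen Foundation, but the general idea is straightforward: discard the most extreme scores for each proposal before averaging, so that a single strategically low (or high) review has little effect on the final ranking. A minimal sketch of one common variant, with a hypothetical set of reviewer scores:

```python
def trimmed_mean(scores, trim=1):
    """Drop the `trim` lowest and `trim` highest scores, then average the rest."""
    if len(scores) <= 2 * trim:
        raise ValueError("not enough scores to trim")
    kept = sorted(scores)[trim:len(scores) - trim]
    return sum(kept) / len(kept)

# Seven reviewers score a proposal out of 10; one gives a strategically low 1.
scores = [8, 7, 9, 8, 1, 7, 8]
print(trimmed_mean(scores))  # the outlier 1 is discarded before averaging
```

Without trimming, the outlier would drag the proposal's mean down by roughly a full point; with it, the rogue score is simply excluded, which is why gaming did not appear to pay off in this study.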

Distributed peer review offers a viable and promising alternative to traditional panel review. While it introduces new challenges, such as increased workload for applicants and a lack of direct discussion, its benefits are compelling. DPR is faster, provides more detailed and diverse feedback, and is perceived as fairer and more democratic. By providing a ready-made pool of reviewers and offering a valuable learning experience for the research community, it resolves many of the challenges facing traditional grant funding allocation.

DPR is, however, different. In thinking about it, there may be a temptation to regard conventional decision-making in a panel as normative and therefore see DPR as problematic. Instead, we should perhaps think of it as doing different things. It is a move from the wisdom of the gatekeeper to the wisdom of the (expert) crowd, and so we should perhaps expect outcomes to be different.

Phase two of this project has now begun, and will address some of the findings from phase one as well as expanding to see how DPR might translate to other funders and applications. For those considering implementation, we have produced a practical guide to DPR for funders. As the academic landscape continues to evolve, DPR has the potential to foster a more dynamic, transparent, and efficient system for distributing research funding.

Melanie Benson Marshall and Anna Butters are research associates in the School of Information, Journalism and Communication at the University of Sheffield and research fellows at the Research on Research Institute. Stephen Pinfield is professor of information services management in the School of Information, Journalism and Communication at the University of Sheffield, and senior research fellow at the Research on Research Institute. Tom Stafford is a professor of cognitive science at the University of Sheffield and senior research fellow at the Research on Research Institute. He writes a newsletter about the psychology of reasoning and decision making at tomstafford.substack.com/.

