
Michigan State University

Innovating the Peer-Review Research Process

Researchers find ways to modernize time-intensive process that plagues many academics

Using machine learning and implementing a feedback mechanism can improve the peer-review process for academics.

A team of scientists led by a Michigan State University astronomer has found that a new process of evaluating proposed scientific research projects is as effective as, if not more effective than, the traditional peer-review method.

Normally, when a researcher submits a proposal, the funding agency asks a number of researchers in that field to evaluate it and make funding recommendations. The system can be bulky and slow, and it is not quite an exact science.


"As in all human endeavors, this one has its flaws," said Wolfgang Kerzendorf, an assistant professor in MSU's Departments of Physics and Astronomy, and Computational Mathematics, Science and Engineering.

As detailed in the journal Nature Astronomy, Kerzendorf and colleagues tested a system that distributes the workload of reviewing project proposals among the proposers themselves, an approach known as "distributed peer review."

The team enhanced the approach with two novel features: machine learning to match reviewers with proposals, and a feedback mechanism on the reviews.

Essentially, the approach combines three features designed to improve peer review.

First, when scientists submit a proposal for evaluation, they are asked to review several of their competitors' proposals. Spreading the work among all proposers reduces the number of reviews each person must complete.


"If you lower the number of reviews that every person has to do, they may spend a little more time with each one of the proposals," Kerzendorf said.
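The article does not describe the exact assignment scheme the team used, but the core constraint is easy to state: every proposer reviews a small number of competitors' proposals, never their own, and every proposal still receives the same number of reviews. A minimal sketch of one such scheme (round-robin over a shuffled circle; the function name and parameters are illustrative, not from the study):

```python
import random

def assign_reviews(proposal_ids, k=3, seed=0):
    """Give each proposer k of their competitors' proposals to review,
    never their own. Walking k steps around a shuffled circle also
    guarantees every proposal receives exactly k reviews.
    Requires len(proposal_ids) > k."""
    rng = random.Random(seed)
    ids = list(proposal_ids)
    rng.shuffle(ids)
    n = len(ids)
    return {ids[i]: [ids[(i + j) % n] for j in range(1, k + 1)]
            for i in range(n)}

# Example: 10 proposers, 3 reviews each.
assignments = assign_reviews([f"P{i:02d}" for i in range(10)], k=3)
for reviewer, batch in assignments.items():
    assert reviewer not in batch and len(batch) == 3
```

With 172 proposals, as in the experiment, each proposer would carry only a handful of reviews instead of a large panel's worth.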


Second, using machine learning, funding agencies can match reviewers with proposals in fields where they have demonstrated expertise. This removes self-reported expertise, and the human bias that comes with it, from the equation, resulting in more accurate reviews.

"We essentially look at the papers that potential readers have written and then give these people proposals they are probably good at judging," Kerzendorf said. "Instead of a reviewer self-reporting their expertise, the computer does the work."
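The study's actual matching model is not detailed in this article. As a rough illustration only, the idea of "look at the papers potential reviewers have written" can be sketched with a bag-of-words profile per reviewer and cosine similarity against each proposal (all names and data below are hypothetical; a real system would use something richer, such as TF-IDF or document embeddings):

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term frequencies over lowercase whitespace tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def match_reviewers(reviewer_papers, proposals):
    """Rank reviewers for each proposal by similarity between the proposal
    text and each reviewer's past papers, concatenated into one profile."""
    profiles = {name: vectorize(" ".join(papers))
                for name, papers in reviewer_papers.items()}
    ranking = {}
    for pid, text in proposals.items():
        v = vectorize(text)
        ranking[pid] = sorted(profiles,
                              key=lambda name: cosine(v, profiles[name]),
                              reverse=True)
    return ranking

# Hypothetical toy data, purely illustrative.
papers = {"alice": ["supernova spectra and radiative transfer"],
          "bob":   ["exoplanet transit photometry surveys"]}
proposals = {"prop1": "radiative transfer models of supernova ejecta"}
print(match_reviewers(papers, proposals)["prop1"][0])  # prints "alice"
```

The top-ranked reviewer for each proposal is the one whose publication record shares the most vocabulary with it, which approximates "give people proposals they are probably good at judging" without any self-reporting.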

And third, the team introduced a feedback system in which the person who submitted the proposal can judge whether the reviews they received were helpful. Ultimately, this might help the community reward scientists who consistently provide constructive criticism.

"This part of the process is not unimportant," Kerzendorf said. "A good, constructive review is a bit of a bonus, a reward for the work you put in reviewing other proposals."

To do the experiment, Kerzendorf and his team considered 172 submitted proposals that each requested use of the telescopes of the European Southern Observatory, a 16-nation astronomy organization headquartered in Germany.

The proposals were reviewed both in the traditional manner and with distributed peer review. The results? Statistically, the outcomes of the two methods were indistinguishable.

However, Kerzendorf said this was a novel experiment testing a new approach to evaluating research proposals, one that could make a difference in the scientific world.

"While we think very critically about science, we sometimes do not take the time to think critically about improving the process of allocating resources in science," he said. "This is an attempt to do this."

- This press release was originally published on MSU Today
