Higher Education Reform

Crowd-Sourcing As a Complement to Peer Review

September 13, 2018
The graphic from Open Review Toolkit illustrates the expected parallel nature of the Open Review process.

As Princeton sociologist Matthew Salganik (currently professor in residence at The New York Times) prepared his new book, Bit by Bit: Social Research in the Digital Age, for public consumption, he in turn was consumed by its central conceit: namely, “how the digital age creates new opportunities for academic authors and publishers.”

Salganik explains on his blog, Wheels on the Bus:

The more I thought about it, the more it seemed that we could publish academic books in a more modern way by adopting some of the same techniques that I was writing about.  I knew that I wanted Bit by Bit to be published in this new way, so I created a process called Open Review that has three goals: better books, higher sales, and increased access to knowledge.  Then, much as doctors used to test new vaccines on themselves, I tested Open Review on my own book.

As the “review” part of the name suggests, Open Review is a form of peer review, albeit more free-form and less restricted to a curated panel of reviewers. “During Open Review,” Salganik wrote, “anyone in the world could come and read the book and annotate it using hypothes.is, an open source annotation system.” The approach is fairly novel, although some other projects, such as bookdown and gitbooks, are advancing along similar crowd-sourcing lines.
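
For readers curious how that annotation layer works in practice, Hypothesis exposes a public search API that returns the annotations anchored to a given page. The short Python sketch below is purely illustrative and is not part of the Open Review Toolkit; the endpoint reflects the publicly documented Hypothesis API, and the chapter URL is a hypothetical example.

```python
# Sketch: fetch public hypothes.is annotations for one page of an Open Review book.
# Uses the public Hypothesis search API; this is an illustration of the annotation
# layer, not code from the Open Review Toolkit itself.
import requests

def fetch_annotations(page_url: str, limit: int = 20) -> list:
    """Return public annotations anchored to `page_url`."""
    resp = requests.get(
        "https://api.hypothes.is/api/search",
        params={"uri": page_url, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("rows", [])

if __name__ == "__main__":
    # Hypothetical chapter URL, used only for illustration.
    for note in fetch_annotations("https://www.bitbybitbook.com/en/preface/"):
        print(note.get("user"), "->", note.get("text", "")[:80])
```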

He points to other books that have had a crowdsourcing aspect, such as Lawrence Lessig’s Code 2.0, which was written in part using a wiki. “One book that really inspired me to try this is Michael Nielsen’s book Reinventing Discovery,” Salganik writes. “And, it seems that he is putting his new book on deep learning through a similar process. Kathleen Fitzpatrick did something similar with her book Planned Obsolescence: Publishing, Technology, and the Future of the Academy.”

Open Review doesn’t abandon traditional peer review, we should note. Instead, in his test, Salganik submitted Bit by Bit to traditional peer review and also converted the text into a standalone website, allowing Open Review to occur separately yet simultaneously. (Or perhaps even post-simultaneously: Open Review ends when the author submits the final manuscript to the publisher, and Salganik’s 2016 book remains available for review scrutiny even now.)

The Open Review Toolkit website addresses the question of whether this process is meant as a replacement to traditional peer review:

It is better to think of Open Review as a complement to traditional peer review rather than a substitute. Rather than having a small number of experts read an entire manuscript, the Open Review process involves a large number of people with different skills and interests engaging with smaller parts of the manuscript. The two processes result in very different types of feedback.

Salganik wrote three posts examining his Open Review experiences for Bit by Bit, dissecting each of those goals (quality, sales and access) in turn. While he found the annotation portion particularly useful, the process also collected “implicit feedback,” or “reader analytics,” such as the traffic and abandonment rate of each section of the book. It also collected intelligence that will prove useful in marketing the book, even if, as he admits, academic authors don’t usually talk about such things.
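
“Reader analytics” of this kind can be computed from ordinary page-view logs. The sketch below uses invented visitor records and section names, not the Toolkit’s actual data or schema, to show one way of estimating the share of readers who open a section but never reach the next one.

```python
# Minimal sketch of a per-section "abandonment rate": the share of visitors who
# open a section but never reach the next one. Records and field names are
# invented for illustration; the Open Review Toolkit's analytics may differ.
from collections import defaultdict

# Hypothetical page-view log: (visitor_id, section_slug)
pageviews = [
    ("v1", "intro"), ("v1", "ch1-observing"), ("v2", "intro"),
    ("v3", "intro"), ("v3", "ch1-observing"), ("v3", "ch2-asking"),
]

section_order = ["intro", "ch1-observing", "ch2-asking"]

def abandonment_rate(views, order):
    """For each section, the fraction of its readers who never reach the next section."""
    readers = defaultdict(set)
    for visitor, section in views:
        readers[section].add(visitor)
    rates = {}
    for current, nxt in zip(order, order[1:]):
        started = readers[current]
        if started:
            rates[current] = 1 - len(started & readers[nxt]) / len(started)
    return rates

print(abandonment_rate(pageviews, section_order))
# e.g. {'intro': 0.333..., 'ch1-observing': 0.5}
```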

Thanks to a $20,000 grant from the Alfred P. Sloan Foundation, Salganik has opened the door to other authors to use Open Review for their own manuscripts. You can access the Open Review Toolkit by either downloading the open-source code or hiring one of the preferred partners.

jimi adams

One of the first authors to take the plunge is adams, an associate professor of health and behavioral sciences at the University of Colorado Denver. His upcoming book for SAGE Publishing’s Quantitative Applications in the Social Sciences series (the famed “little green books”) has been submitted to the Open Review process. The manuscript, Gathering Social Network Data, is now live here: https://www.getsndata.com/.

Given that the book is about social networks, it’s fitting that it is an early adopter of Open Review. As adams writes in his acknowledgement section, “Any project, no matter how long, benefits from feedback of a variety of sorts, and this book is no different.”

adams said his decision to use Open Review followed from his acquaintance with and respect for Salganik. The two talked at the recent American Sociological Association annual meeting, and when Salganik learned adams had submitted Gathering to peer review, he suggested trying out Open Review, too. Some of the conversation, adams recounts, focused on the technical aspects of using the site; the Toolkit’s home page includes the disclaimer that the system “is not yet as easy as we want it to be.” But Salganik’s enthusiasm won him over, adams explained to Social Science Space:

[F]or my purposes the ideas that were particularly compelling were: (1) getting some “advance advertising” for the book, which he’s convinced will improve sales. That’s a tricky one to actually evaluate, but in conversations I was having with other folks (including a number of editors at ASA) they seemed mostly to buy that as plausible. (2) The peer review process often focuses on the accuracy of ideas more than on how clearly they’re communicated. So an Open Review process has the possibility of getting people outside the experts or direct audience of a book to take a quick look at (parts of) the material and give feedback on what makes sense to them, what doesn’t, what could be said more clearly, etc.

As a first-time book writer, even for something only in the range of 40,000 words, I found the opportunity to get that type of feedback especially exciting. Salganik said he really thought the resulting book for him was much clearer because of this process, and that is never a bad thing. So I’m hoping that works for mine as well.
