Business and Management INK

Answers to 20 Questions About Interrater Reliability and Interrater Agreement

September 26, 2011

James M. LeBreton, Purdue University, and Jenell L. Senter, Wayne State University, published “Answers to 20 Questions About Interrater Reliability and Interrater Agreement” in the October 2008 issue of Organizational Research Methods (ORM). It was the most-frequently read ORM article of July 2011, based on calculations from Highwire-hosted articles. Most-read rankings are recalculated at the beginning of each month and are based on full-text and PDF views. Professor LeBreton kindly provided the following responses to our questions about the article.

Who is the target audience for this article?

The primary audience is graduate students, faculty, and practitioners working in organizational psychology, human resource management, and organizational behavior. That said, I have received a number of e-mails asking about the paper from colleagues scattered across a wide array of social sciences.

Were there findings that were surprising to you?

Our particular paper was not designed to test a priori hypotheses. Instead, we sought to synthesize and integrate roughly 30 years of thinking on issues related to interrater reliability and interrater agreement. So we did not have “findings,” per se. Instead, we structured our paper as 20 key questions related to the use of interrater reliability and agreement statistics. We then did our best to answer these questions in a way that would provide others with a clear set of guidelines for using these statistics in their research and work.

How do you see this study influencing future practice?

We hope that our paper provides helpful guidelines for individuals using interrater reliability and agreement statistics in their practice. Practitioners often invoke these statistics when conducting organizational climate or culture studies, performance evaluation studies, or even when examining the quality of ratings obtained via a panel of interviewers. Our paper was written to provide important information to help practitioners and researchers with (a) the selection of the correct agreement or reliability statistic, (b) the correct estimation of the statistic, (c) the correct interpretation of the statistic, and (d) an understanding of how various features of their situation might influence estimates of agreement/reliability (e.g., missing data, number of items on a scale, number of raters/judges).

How does this study fit into your body of work/line of research?

I have been publishing articles that use or refine estimates of interrater agreement and reliability for roughly 10 years. This paper represents an opportunity to reflect on my thinking over these years and integrate it with the thinking of my co-author (Jenell Wittmer-Senter) to arrive at a product that we believe will be helpful to both researchers and practitioners.

How did your paper change during the review process?

The most substantive changes involved providing a more balanced treatment of the rwg coefficient. This is the coefficient that I use in my own work, and thus we were probably a bit too laudatory in our initial evaluation. The revised paper presents both the pros and cons of rwg. While I still think it is a great way to estimate agreement, it is not without its limitations, and those are now addressed more explicitly in our final paper. We also expanded the set of agreement coefficients we discussed to include awg, AD, and SD.
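To give a concrete sense of the rwg coefficient discussed above, here is a minimal sketch (my own illustration, not code from the article) of the single-item rwg(1) index of James, Demaree, and Wolf (1984): one minus the ratio of the observed variance in judges’ ratings to the variance expected if judges responded uniformly at random.

```python
def rwg(ratings, num_options):
    """rwg(1): 1 - (observed variance / expected variance under a uniform null).

    ratings: list of judges' ratings of a single item.
    num_options: number of points on the response scale (e.g., 5 for a 5-point scale).
    """
    n = len(ratings)
    mean = sum(ratings) / n
    # Observed between-judge variance (sample variance)
    s2 = sum((x - mean) ** 2 for x in ratings) / (n - 1)
    # Expected variance of a uniform distribution over num_options scale points
    sigma_eu2 = (num_options ** 2 - 1) / 12
    return 1 - s2 / sigma_eu2

# Five judges rating one item on a 5-point scale, with high agreement:
print(rwg([4, 4, 5, 4, 4], 5))  # → 0.9
```

Values near 1 indicate strong within-group agreement; values near 0 indicate ratings no more consistent than chance. This sketch uses the uniform null distribution, which is the most common choice but, as the article notes for rwg generally, not the only defensible one.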

What, if anything, would you do differently if you could go back and do this study again?

As I noted above, this paper wasn’t structured as a traditional “research study.” Thus, there are no particular design or analysis issues I would do differently. Overall, I am quite pleased with the paper. I believe it has the potential to serve as a helpful resource for individuals wanting to estimate interrater agreement and/or interrater reliability. It was structured as a Q&A paper. We certainly didn’t address all possible questions related to agreement and reliability, but I hope we addressed some of the more pressing ones for individuals who are new to using these statistics.

To learn more about Organizational Research Methods, please click here.


Business and Management INK puts the spotlight on research published in our more than 100 management and business journals. We feature an inside view of the research that’s being published in top-tier SAGE journals by the authors themselves.
