Paper on Amazon’s Mechanical Turk Proves a Durable Article

August 5, 2022

Being first, or at least among the first, is generally an advantage in most endeavors. And so it proves in SAGE Publishing’s annual 10-Year Impact Awards, where a 2011 paper on Amazon’s then-new Mechanical Turk, a platform that, among other things, crowdsources prospective participants for social and behavioral research via an online marketplace, has garnered 7,500 citations in the decade since.

That makes it the most-cited paper appearing in a SAGE-published journal in 2011. SAGE (the parent of Social Science Space) started the 10-Year Impact Awards in 2020 as one way to demonstrate the value of social and behavioral science. While article citations and journal impact factors are the standard measures of literature-based impact in academia, their two- or five-year windows don’t account for papers whose influence grows over time or that are recognized at a later date. This is especially acute in the social sciences, where impact factors “tend to underestimate” the value of social science research because of time lags and the field’s interest in new approaches, rather than solely iterative ones.

One such new approach was MTurk, as the Amazon platform is known. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” in Perspectives on Psychological Science describes the then-potential contributions of MTurk to the social sciences. The authors — Michael Buhrmester, Tracy Kwang and Samuel D. Gosling, all then in the Department of Psychology at the University of Texas at Austin — found that “overall, MTurk can be used to obtain high-quality data inexpensively and rapidly.”

As part of a series talking with the authors of this year’s winners, we asked Buhrmester, the lead author of the paper, to reflect on the impact this article has had in the decade since it appeared.

Michael Buhrmester

In your estimation, what in your research – and obviously the published paper – is it that has inspired others or that they have glommed onto?  

The paper, along with several others published around the same time, served a few very practical purposes for researchers. First, it introduced a platform, relatively new at the time, for collecting data online with a relatively low barrier to entry. Collecting data online in the pre-MTurk era was certainly possible, but there wasn’t an efficient ‘plug and play’ approach like the one MTurk offered. Second, our evaluation of the data quality at that time suggested, in essence, that it was as defensible a source of data as other common sources. Along with other evaluations coming to similar conclusions, I believe this had the effect of reducing some unfounded skepticism and fears about online data collection methods generally. Last, and most importantly, I believe the paper helped spark a more substantial, continuous evaluation of not just MTurk and platforms like it, but of all the major data collection methods utilized by social scientists.

What, if anything, would you have done differently in the paper (or underlying research) if you were to go back in time and do it again?  

I’ve found that the paper is often cited as a comprehensive defense of any use of MTurk for data collection, and that’s a shame because there have been so many more thorough evaluations in the ensuing years. These evaluations have uncovered a host of issues and solutions, many of which would have been tough to anticipate. However, I do wish we had more actively conveyed the equivalent of “Warning: Conditions May Change Quickly” to encourage researchers to seek out the most up-to-the-minute evaluations.

What direct feedback – as opposed to citations – have you received in the decade since your paper appeared?  

Over the years, I’ve personally responded to well over 1,000 emails from researchers at all career stages from all over the world covering just about any MTurk-related question one could imagine. I’ve learned (and forgotten) more about MTurk than I’d imagined while drafting the manuscript.  

How have others built on what you published? (And how have you yourself built on it?)

Beyond the proliferation of other online data platform evaluations, ambitious teams have built better mousetraps over the years (e.g., Prolific). Beyond being part of the community of researchers engaged in issues related to online methodologies, my program of research has largely focused on my true passion and area of expertise: uncovering the causes and consequences of identity fusion with my grad advisor, Bill Swann, and post-doc advisor, Harvey Whitehouse.

Could you name a paper (or other scholarly work) that has had the most, or at least a large, impact on you and your work? 

Bill Swann’s seminal work on Self-Verification Theory (pick any paper from the late 70s to early 90s) is why I became a social psychologist and is a foundation upon which pretty much all of my work (a lot of it with Bill!) rests.  

Lastly, since it seems you have all gone on different routes since writing the paper, I’d love to know what you have done in the last decade. 

After a fruitful and adventure-filled post-doc working jointly at the University of Texas and Oxford, I took the leap out of academia a couple of years ago. I now work as a quantitative researcher at Gartner, applying social psychology to generate actionable insights for business leaders.


An interview with the lead author of the third most-cited article in the 10-Year Impact Awards, on the Danish National Patient Register, appears here.

