
The Changing Imperative to Demonstrate Social Science Impact

May 29, 2019

Over the past five years, discussion of the impact of social science research, and of how that impact is measured and understood, has become increasingly complex, controversial and contested, a condition that can be succinctly summarized in three quotes:

This article by Ziyad Marar originally appeared on the LSE Impact of Social Sciences blog as The changing imperative to demonstrate social science impact and is reposted under the Creative Commons license (CC BY 3.0).

As sociologist and Microsoft Research principal researcher Duncan Watts recently observed: “Measurement is a tremendous driver of science.” When you measure things, your conception of the issues – and therefore the solutions – changes. “Measurement,” he adds, “casts things in a completely different light.”

On the other hand, as the quote often attributed to Einstein goes, “not everything that can be counted counts, and not everything that counts can be counted.”

And finally we have Goodhart’s law (named after economist Charles Goodhart), often expressed as “when a measure becomes a target, it ceases to be a good measure.”

Put another way, measuring and demonstrating the impact of social science research is political: first, in the sense that measurement and the attribution of value can shape research practices themselves, but also in that the value such measures describe is inherently limited and, if misapplied, can lead to unintended consequences. SAGE Publishing has often had a role in this conversation, in particular as a convener of expert voices.

Five years of progress

Five years ago, we published The Impact of Social Sciences with the LSE, the culmination of a three-year project focused on simply beginning to understand the impact of the social science enterprise in the UK. The next year, we published The Metric Tide, an independent assessment of the role of metrics in assessing research. Important though these studies were, in some ways they only represent the beginning of this conversation. For instance, in the years since The Metric Tide review was announced in April 2014 by the then Minister for Universities and Science, David Willetts, the next Research Excellence Framework (REF 2021) has moved to increase the weight it places on impact. Furthermore, entirely new impact measures, such as the Knowledge Exchange Framework (KEF), have also been introduced.

In the United States, claims that social science has no real impact have been used repeatedly in political attacks that attempt to cut social science funding – even as social science was being deployed effectively in questions of finance and national security that obsessed these same critics.

The National Science Foundation now also requires “broader impacts” to be explicit in the research it funds. Foundations increasingly require clear and direct demonstrations of impact from their grantees. Even the Pentagon has gotten into the game: Its Defense Advanced Research Projects Agency has tasked social scientists with creating an artificial intelligence system, known as SCORE, to quantitatively measure the reliability of social science research and “thereby increase the effective use of [social and behavioral science] literature and research to address important human domain challenges.”

At the same time, an explosion of data has blown open new doors for social and behavioral research and pointed to many tools that could measure its impact. In 2014, Altmetric had only existed for two years; now this service and a range of other digital tools such as those provided by Clarivate and Google Scholar have become firm features in the research landscape.

However, despite these advances, a key unresolved issue in this debate is that no consensus has emerged within the social sciences, or beyond, as to how social science impact can be demonstrated. This void has allowed other disciplines to set the agenda for the social sciences. A new survey of faculty at four U.S. universities from the Association of College & Research Libraries (ACRL) finds that social science researchers are shifting their conceptions of demonstrating impact toward “ways more aligned with the Sciences and Health Sciences” (and away from those favored in the arts and humanities).

This is problematic since impact measurements appropriate for astrophysics won’t read across well for anthropologists. Writing in Research Evaluation, the Italian National Research Council’s Emanuela Reale and her colleagues argue that the predominant methods used by natural scientists to demonstrate impact “tend to underestimate” the value of social science research because of time lags, methodological variety and social science’s interest in new approaches, rather than solely iterative ones.

Meanwhile, the same ACRL survey finds that across the academy, faculty are generally unaware of ways to measure impact apart from citations (in presumably high-Impact Factor journals). This narrow understanding misses not only the value of alternative metrics, but also of work outside a scholar’s discipline or beyond a single paper. Reale and her colleagues noted a similar “strong orientation of the scholars’ efforts towards considering scientific impact as a change produced by a single (or a combination of) piece(s) of research, with a limited interest in deepening conditions of the research processes contributing to generating an impact in the interested fields.” This evidence strongly suggests the need to return to previous discussions on demonstrating social science impact, but this time in a way that is inclusive of the scholarly community at large, as well as those outside of it.

Reopening the debate

Taking this into account, we are looking to reignite some of these debates and stimulate fresh thinking on the impact of social science. Earlier this year, SAGE assembled a working group to share ideas for helping scholars navigate the slippery concept of impact and the shortcomings of established metrics. As part of this effort we have produced a white paper highlighting the findings of this group, which is now available. The report maps out stakeholder categories, defines key terms and questions, puts forward four models for assessing impact, and presents a list of 45 resources and data sources that could help in creating a new impact model. It also establishes imperatives and recommended actions to improve the measurement of impact including:

  • recognition from the community that new impact metrics are useful, necessary, and beneficial to society;
  • establishing a robust regime of measurement that transcends but does not supplant literature-based systems;
  • coming to a shared understanding that although social science impact is measurable like STEM, its impact measurements are unlikely to mirror STEM’s;
  • and creating a global vocabulary, taxonomy, metadata, and set of benchmarks for talking about measurement.

To keep the conversation alive for the longer term, we have also developed a new impact section of the SAGE-sponsored community site Social Science Space. (This space itself is also being used to gather ideas, amplify diverse opinions, and engage in debate about impact with global actors engaged on the topic.) We are also hosting events to share progress and crowdsource ideas, kicking off with a May event in Washington, DC, and a June event in Vancouver, Canada.


Ziyad Marar is an author and president of global publishing at SAGE Publishing. His books include Judged: The Value of Being Misunderstood (Bloomsbury, 2018), Intimacy: Understanding the Subtle Power of Human Connection (Acumen Publishing, 2012), Deception (Acumen Publishing, 2008), and The Happiness Paradox (Reaktion Books, 2003). He tweets @ZiyadMarar.

