
Can We Use Altmetrics to Measure Societal Impact?

April 9, 2019

Measuring sticks (Image: Frankieleon/Flickr)

Many benefits emerge from academic research, and they affect stakeholders in diverse ways. Impact is thus a multi-faceted phenomenon, which raises the question: what are the most informative tools for tracking these different outcomes?

This article by Lutz Bornmann and Robin Haunschild originally appeared on the LSE Impact of Social Sciences blog as “Are altmetrics able to measure societal impact in a similar way to peer review?” and is reposted under the Creative Commons license (CC BY 3.0).

When quantitative approaches to research evaluation were first trialed at the end of the 1980s, they mostly drew on publication and citation data (bibliometrics) and they mostly targeted academic impact – the impact of research on other academics. More highly cited work was taken as an indicator of research ‘excellence’, which was widely pursued as a public policy goal. Academic research excellence remains important, but the policy agenda has shifted, notably since the introduction of societal impact considerations into the UK’s Research Excellence Framework (REF). However, assessing the nature, scale, and beneficiaries of research impact, especially quantitatively, remains a complex undertaking.

One potential way of quantitatively assessing societal impact has been through altmetrics – online indicators of research use. In a recent study, based on data from the UK REF, we therefore decided to examine the extent to which altmetrics are able to measure societal impact in a way similar to the peer review of case studies. We found a relationship, but not one that provides a firm indicator.

Fortunately, data for REF2014 are available for comprehensive studies. One key feature of the review process is that we have two distinct and therefore comparable types of publications being submitted: (i) evidence of academic achievement based on four selected outputs per researcher and (ii) evidence of socio-economic impact based on case studies with six underpinning references.

Drawn from …
This blog post is based on the authors’ co-written article “Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)” in the Journal of Informetrics.

Our study focused on those items submitted to REF2014 that can be uniquely identified via DOIs (Digital Object Identifiers): essentially, journal papers rather than other output types. For journal papers, we can also acquire impact and attention data: citation counts and media mentions of various kinds. We anticipated that the impact of papers submitted as academic outputs would differ significantly from the impact of references cited in case studies: the former should be strong in academia but weak in other sectors of society, whereas the opposite should be true for the latter.

For our analysis, the test prediction was that papers submitted as evidence of academic achievement would be relatively well-cited compared to papers that supported case studies. By contrast, the papers supporting case studies might be relatively less well-cited but would have wider societal recognition, trackable through Altmetric.com data (sourced from Twitter, Wikipedia, Facebook, policy-related documents, news items, and blogs). If we discovered no difference between these publication sets in their bibliometric citations and altmetric mentions, our ability to quantitatively distinguish between different kinds of impact would be brought into doubt.

In practice, we compared three publication categories, not two, because the pool of submitted outputs and the pool of case study references overlap to a substantial degree. There were 120,784 journal papers in the REF2014 database of submitted research outputs (PRO) and 11,822 journal papers among the case study references (PCS) that we were able to match with citation data via their DOIs; 5,703 papers appeared in both sets (PRO/PCS). Intriguingly, the overlap was lower in basic research areas than in applied research areas.
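As an aside for readers interested in the mechanics, this three-way categorization reduces to set operations on the two DOI lists. The following Python sketch is purely illustrative: the DOIs are invented and the variable names are ours, not the study's.

```python
# Illustrative sketch of the DOI-based categorization (invented DOIs; the
# real study matched 120,784 PRO papers and 11,822 PCS papers).

# Hypothetical DOI sets standing in for the REF2014 exports.
pro_dois = {"10.1000/a1", "10.1000/a2", "10.1000/a3"}  # submitted outputs
pcs_dois = {"10.1000/a3", "10.1000/b1"}                # case study references

pro_pcs = pro_dois & pcs_dois   # papers playing both roles (PRO/PCS)
pro_only = pro_dois - pcs_dois  # "pure" PRO set
pcs_only = pcs_dois - pro_dois  # "pure" PCS set

print(f"PRO only: {len(pro_only)}, PCS only: {len(pcs_only)}, "
      f"PRO/PCS overlap: {len(pro_pcs)}")
```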

Our study examined convergent (and discriminant) validity: do indices of societal and academic impact vary in distinct ways between PCS and PRO articles? And do different approaches to societal impact (REF scores and altmetrics) produce comparable measures of a common construct in case study data? If they measure the same construct and are convergently valid, then REF scores should correlate with altmetrics data.

We expected higher correlations (i) for PRO between REF output scores and citation impact, and (ii) for PCS between REF impact scores (for case studies) and altmetrics. We expected lower correlations (i) between REF output scores and altmetrics for PRO, and (ii) between REF impact scores (for case studies) and citation impact for PCS.

We found:

  • Average bibliometric citation impact is higher for PRO than for PCS.
  • Mentions of papers in policy-related documents (especially) and Wikipedia are significantly higher for PCS than for PRO; the result for news items is similar, though slightly smaller.
  • For Twitter counts, the PCS-PRO difference is close to zero, and the counts do not correlate with citations: tweets do not appear to reflect any serious form of impact.
  • The highest scores, across all indicators, were associated with the PRO/PCS overlap. These publications were cited as frequently as the pure PRO set and scored higher than the pure PCS set on altmetrics from every source.

We then correlated REF scores and metrics at the level of UK research institutions, following the approach of comparing peer review decisions with later citation impact (Bornmann, 2011). We found that REF scores on impact case studies correlated only weakly with altmetrics, thereby undermining arguments in favor of using altmetrics to measure societal or broader impact. The weak relationship between peer assessments of societal impact and altmetrics data mirrors other studies (Thelwall & Kousha, 2015) and calls into question any application of altmetrics to measuring societal impact in research evaluation. Whereas peers can recognize a successful link between research and societal impacts (based on descriptions in case studies), altmetrics do not seem able to reflect it. Altmetrics may instead reflect distinct public discussions around certain research topics (which can be visualized; see Haunschild, Leydesdorff, Bornmann, Hellsten, and Marx, 2019).
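To make that institution-level comparison concrete, here is a minimal sketch of the kind of rank correlation described above. It is an illustration under our own assumptions, not the authors' actual code; the column names and values are hypothetical.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-paper records; the column names are illustrative and do
# not reflect the actual REF2014 schema.
papers = pd.DataFrame({
    "institution":     ["A", "A", "B", "B", "C", "C"],
    "ref_impact":      [4, 3, 2, 3, 1, 2],    # peer-review impact score
    "policy_mentions": [5, 2, 1, 0, 0, 1],    # an altmetric count
})

# Aggregate to the institution level, then compute a rank correlation
# between the peer-review scores and the altmetric indicator.
by_inst = papers.groupby("institution").mean(numeric_only=True)
rho, p = spearmanr(by_inst["ref_impact"], by_inst["policy_mentions"])
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
```

A weak correlation in such a comparison, as the study reports, would suggest that peer-assessed societal impact and altmetric counts are not capturing the same construct.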

Perhaps the most interesting results here are the relatively high scores – across the board – for publications that were submitted by individual researchers and then also used to support case studies. Some outputs have an evident capacity for impact, whether that is among other researchers or in their potential for wider application. There is therefore no necessary gap between academic and societal value, a conclusion that has been known at least since Vannevar Bush’s seminal Science, the Endless Frontier. Societal value can be expected from research that follows high academic standards.


Lutz Bornmann (pictured) is a habilitated sociologist of science and works at the Division for Science and Innovation Studies of the Max Planck Society. He received the Derek de Solla Price Memorial Medal in 2019. Robin Haunschild is at the Max Planck Institute for Solid State Research. A chemist, his current research interests include the study of altmetrics and the application of bibliometrics to specific fields of the natural sciences, e.g. climate change.
