Communication

How Concerned Should We Be About Gaming Altmetrics?

July 11, 2014


So what’s the next move in creating genuinely useful metrics for determining academic value? (Photo: Jimmy Baikovicius/Flickr)

Scholarly communication is the process that starts with a research idea and ends with a formal scientific publication that, through peer review, is accepted as part of scientific knowledge. The publication is then used (or not) by other researchers, who acknowledge the research by referencing it in new scientific publications. The references given and citations received are the links between earlier and current research. They are also part of the scientific reward system, which stems from the assumption that more valuable research receives more citations, and hence that more highly cited researchers tend to be more highly regarded.

But on the web, and especially in social media, scientific output can be acknowledged in many other ways than through formal citations: scientific articles can be mentioned on Twitter, liked on Facebook, bookmarked on Mendeley, or commented on in blogs. It is also worth mentioning that it is not only scientific articles that can gain visibility in social media, as other research products such as datasets and software can also be shared, commented on, and referenced there.

This article by Kim Holmberg originally appeared on the LSE Impact of Social Sciences blog as “Altmetrics may be able to help in evaluating societal reach, but research significance must be peer reviewed” and is reposted under the Creative Commons license (CC BY 3.0).

These traces of scholarly communication in social media can be thought of as acts that acknowledge the value of the research or a researcher’s work, or at least as indicators of awareness and attention. This idea has led many researchers to investigate whether these traces could in fact be measured and used to evaluate the impact, visibility, or even quality of research. These efforts often go under the title of altmetrics, or alternative metrics, although there seems to be consensus in the research community that these metrics are not alternatives to traditional citation-based impact measures but rather complements to them.

In fact, an increasing body of evidence suggests that altmetrics may indicate another type of impact, namely the attention from the general public.

Today altmetrics is used to refer to a wide range of different social media metrics, although only a few of them have shown potential as truly valuable indicators of visibility, attention, or use. Earlier research has, for instance, shown that only Twitter has relatively good coverage of scientific content (Thelwall et al., 2013), or at least that Twitter allows researchers to access its data (although in a limited way). Facebook would certainly be a rich source for altmetrics, but only Facebook’s own researchers have access to the wealth of data collected from its users. This limited data access is troublesome, as it hinders one of the fundamental requirements of research, namely reproducibility. If other researchers cannot access the data and repeat a study to verify its results, can it be called research?

It is probably not correct to bundle all possible social media metrics under the same umbrella, as explicit evidence of their value as indicators of scholarly activities is still lacking. Having said that, some social media based indicators of scientific activities and products have exciting potential that needs to be researched further to fully understand their value and possible applications. More research is also needed before we can even talk about using altmetrics for research funding decisions or for decisions about tenure or promotion. Altmetrics can, however, already be used, at least to some extent, for two purposes: 1) to help researchers locate popular and/or interesting research, and 2) in some cases, to tell something about attention from the general public.

As the growth rate of scientific publishing increases, it has become impossible for researchers to keep track of everything that is being published within their subject area. For instance, based on data from SCImago Journal and Country Rank, there were almost 18,000 publications within the subject area of sociology and political science alone in 2012.

It is therefore crucial for researchers to be able to find the research most relevant and valuable to them. Locating what is interesting (and perhaps to some degree valuable) can be aided by social media metrics. Frequent retweeting, for instance, can point to popular content, and it may be valuable to learn what other researchers are reading on Mendeley. Numbers generated from this kind of attention can give article-level insights into what is interesting or what other researchers have found valuable. In fact, many publishers are attaching altmetric indicators to their online publications, partly to help researchers locate possibly interesting research, but also to let authors learn about the attention their research has received.

Universities have traditionally had three missions: 1) education, 2) research, and 3) societal impact or public outreach. Both education and research have long been evaluated and measured, but measuring the impact of public outreach has been more difficult. Some current research results suggest that it is precisely this kind of impact and attention that some altmetrics are measuring. It has, for instance, been suggested that the scientific articles that receive the most attention in social media do so not necessarily because they are of higher quality, nor mainly because other researchers create the attention, but rather because their seemingly humorous titles have caught the attention of the general public (Haustein et al., 2013). Whether that is the kind of societal impact that research should aim for can be debated; this third mission of universities is, however, of increasing importance. In the next Research Excellence Framework (REF) in the UK, for instance, 20 percent of the evaluation will be based on ‘reach and significance.’ Altmetrics may be able to help in evaluating societal reach, but its significance must be peer reviewed.

If altmetrics were used for funding decisions, questions about possible unintentional and intentional gaming of the numbers would have to be investigated. What is unintentional gaming of altmetrics, and where do we draw the line? If I tweet about my newly published article once, am I gaming the numbers? What if I tweet about it, mention it on Facebook, LinkedIn, and my blog? What if I ask my followers on Twitter to retweet it? What if I tweet about it twice, or five times, or 10 times? What if I create a bot that tweets about the article 10,000 times? Where do we draw the line between promoting your own work and gaming altmetrics? Unintentional gaming of altmetrics is already happening, as there are bots and feeds on Twitter that tweet everything published in a certain journal or preprint archive. Some of these bots select what to tweet from RSS feeds based on certain keywords; others tweet everything. So far these are probably meant to help researchers find interesting articles, but they are unintentionally inflating the articles’ numbers. As it is relatively easy to create bots or other automated scripts to inflate the numbers, there is a real danger that the moment altmetrics were used for funding decisions or for decisions about promotions, at least some researchers would start to game the numbers. To the best of my knowledge, there is to date no research on the current extent of gaming, whether intentional or unintentional, nor on how to detect gaming of altmetrics.

Social media indicators of scholarly communication, the so-called altmetrics, are still far from being adopted as part of the everyday evaluation of scientific articles and other scientific products, but they already have some value in indicating what is interesting and popular. Future research will tell whether altmetrics can be used for something more.


Kim Holmberg is a postdoctoral research associate at VU University Amsterdam, Department of Organization Sciences, and he is affiliated to University of Wolverhampton, UK, and Åbo Akademi University, Finland. His research interests include webometrics, scientometrics, altmetrics, social media, and social network analysis. Holmberg’s current work includes researching the meaning and validity of altmetrics and mapping climate change communication in social media.

