
In Research, Engagement Is Not the Same As Impact

April 12, 2016

In Australia’s Innovation Statement late last year, the federal government indicated a strong belief that more collaboration should occur between industry and university researchers.

At the same time, government, education and industry groups have made numerous recommendations that the “impact” of university research be assessed alongside, or in addition to, the existing assessment of research quality.

How should we measure research?

But what should we measure and, more importantly, why should we measure it?


This article by Stephen Taylor originally appeared at The Conversation, a Social Science Space partner site, under the title “When measuring research, we must remember that ‘engagement’ and ‘impact’ are not the same thing.”

In accounting, we stress that the measurement basis of something inevitably reflects the purpose for which that measure is to be used.

So what is the purpose of measuring engagement, impact or, for that matter, quality?

The primary reason for measuring quality seems fairly self-evident: as a major stakeholder in funding (especially dedicated research-only funding), the government wants an assessment of just how good, by academic standards, such research really is.

Looking ahead, measures of quality such as the Excellence in Research for Australia (ERA) rankings may come to influence future funding via prestigious competitive schemes (such as those run by the Australian Research Council), block funding for infrastructure, and the availability of government support for doctoral students through Australian Postgraduate Awards.

So the demand for a measure of research quality and the potential uses of such a measure are pretty clear.

But what valid reasons are there for investing significant resources in the measurement of research impact or engagement?

If high-quality research addresses important practical problems (large or small), surely we would expect impact to follow?

In this sense, the extent of impact is really a joint product of the quality (or robustness) of research and the choice of topic (i.e., practical versus more esoteric).

Research impact needs time

But over what period should impact be measured?

Recent exercises, such as that conducted by the Australian Technology Network and the Group of Eight, have a relatively short-term focus, as would any “impact assessment” tied to the period covered by the existing ERA time frame (say, the last six years).

I and many others maintain that impact can only be assessed over much longer periods, and that in many cases short-term impact is potentially misleading.

How often have supposedly impactful results subsequently been rejected or overturned?

Such examples inevitably turn out to reflect low-quality (and, in some cases, outright fraudulent) research.

Ranking impact

Finally, how can impact be ranked? Is there a viable measure that can distinguish between high and low impact? Existing case-study approaches are unlikely to yield any form of quantifiable measurement of research impact.

Equally puzzling is the call to measure research engagement. What is the purpose of such an exercise? Surely in a financially constrained research environment, universities readily recognize the importance of such engagement and pursue it constantly.

We don’t need a national assessment of engagement to encourage universities to engage.

Motive aside, one approach canvassed is the quantum of non-government investment in research (i.e., non-government research income).

This is arguably one rather limited way to measure engagement, and is focused on input rather than output. If the purpose of any measurement is to capture outcomes, does it make sense to focus exclusively on inputs? The logic of this escapes me.

Engagement and impact are not the same thing

Even more worryingly, some use the terms engagement and impact interchangeably.

They would have us believe that a simple (but useful) measure of impact is the extent to which university researchers receive industry funding. Surely this is, at best, a measure of engagement, not impact.

Although the two are likely correlated, the strength of that correlation will vary greatly across discipline areas.

Further, in business disciplines, much of the “knowledge transfer” that occurs via education (including areas such as executive programs) reflects the impact of the constant process of researching better business practices across areas such as accounting, finance, economics, marketing and so on.

Discretionary expenditure on such programs by business is surely an indication of the extent to which business schools and industry are engaged, yet this would be ignored if we focused on research income alone.

We must not lose sight of the fact that quality (i.e., rigor and innovativeness) is a necessary but not sufficient condition for broader research impact.

Engagement is not impact, and simple measures such as non-government research income tell us very little about genuine external engagement between universities and industry.

As accountants know, performance measurement reflects its purpose. What we need, before any further national assessment of attributes such as impact or engagement, is a clear understanding of the purpose of such an exercise.

Only when the purpose is clearly specified can we have a sensible debate about measurement principles.

