The Dutch Senate last year passed a new Standard Evaluation Protocol that highlights the importance of social impact for research. The new protocol was developed by the Royal Netherlands Academy of Arts and Sciences, the Association of Dutch Universities and the Dutch Science Council, and will be used to evaluate academic research from 2015 to 2021.
The Dutch Standard Evaluation Protocol (SEP) adds new elements to well-established research evaluation criteria. As before, excellent science will continue to be scrutinized for its scientific merit and influence, if only because high impact is a disaster when the underlying research is bad. But academic quality (and its twin, academic productivity) is no longer the ultimate factor to be considered. Commercial valorization of knowledge and societal impact have been added to the list of areas in which scientists must demonstrate their contribution.
This is an important change, in particular for our own field, development studies. Development studies aims at impacting society, in particular at transforming society and the economy to the benefit of vulnerable groups. The activist-type research that characterizes many knowledge-building activities in development studies involves two-way communication with social movements, local actors and other stakeholders. Researchers of development policies have long seen social impact as an ethical and moral responsibility of scientific research towards societies that increasingly realize that science can have positive and negative consequences for everyday life. ‘Science with’ and ‘science for’ society is therefore not just an option – it is a necessity in development studies.
The SEP’s focus on societal impact would thus seem to be welcome, especially since development studies, as a multidisciplinary field, is often seen to have a competitive disadvantage vis-à-vis mono-disciplinary approaches. The SEP, however, leaves the question of how to measure and/or report social impact unanswered, thereby challenging the academic community to develop methodologies that demonstrate impact. This is a challenge indeed, because ‘societal impact’ is not always, or indeed often, self-evident, and the topic therefore needs much more attention from researchers. This is a sobering observation, because the prevailing assumption to date has been that societal impact is evident, as if it were part of the DNA of those involved in development economics and other branches of the multidisciplinary field of development studies.
Impact where, for what and on whom?
Western societies demand greater transparency about science’s impact on and contribution to society. Politicians want to be able to check whether tax money is spent where its contribution to society is largest. The field of development studies, however, deals with non-Western countries and often with truly global issues, so impact at the level of national units is often difficult to demonstrate. Moreover, development is a long-term process, which raises the question of how to account for impacts that may very well only be reaped by future generations. It is therefore important that education (at both PhD and master’s level) and capacity development form bridges between science and society. Development studies, moreover, has to deal with specific ethical and moral dilemmas, in particular when vulnerable groups are involved. Data cannot always be made available, because doing so may actually hurt the position of those being studied.
In the same vein, full transparency and open access may make it difficult to engage non-government actors, including the private sector, and may very well provoke negative reactions from those in power. Relatedly, it is comparatively difficult to conceive of truly neutral and value-free research when deprived groups are involved. It is vital not to lose sight of the ‘do no harm’ principle.
Social impact @ sciences: The end of the ivory tower? provides perspectives on the impact of the social sciences – covering management sciences, commercial valorization, development studies and research quality – by Huibert Pols (Rector Magnificus, Erasmus University), Jack Spaapen (KNAW, Royal Netherlands Academy of Arts and Sciences), Eric Claassen (Erasmus Valorisation Center), Ann Buchanan (ESRC, Oxford University) and Wilfred Mijnhardt (Rotterdam School of Management and Erasmus Research Institute of Management).
Some of these aspects of course relate to the nature of research in the social sciences itself. Social scientists often have to deal with specific and sensitive types of data, have limited budgets available for assessments that track (long-run) social impact, and are confronted with ambiguity over whose intervention actually produced a particular impact in a particular situation when many stakeholders from diverse sectors are involved. The question then arises of who should receive credit for a certain policy outcome when the research has been carried out with a variety of stakeholders, including government officials, NGOs and private sector representatives. In view of this, it is necessary to reconsider artificial dichotomies (excellence/relevance, society/market, keep control/give control).
Beyond conventional boundaries
Interdisciplinary research is not only essential if societal and policy-relevant research is to be produced; mono-disciplinary research often does not make sense for studies that analyze the impact of interventions in developing countries and emerging markets. Indeed, the social impacts of research cross conventional academic boundaries and measurements. One important implication is that evaluations that deploy conventional publication-centric assessments to trace impact and relevance are inadequate. The established reward system in universities also plays a role in impeding the maximization of the social relevance of scientific research, as well as of the researcher. This is particularly noticeable in the field of development studies, where candidates who, so to speak, tick the boxes of the traditional evaluation criteria are often favored and rewarded.
For a long time, many academics saw specialization as the royal road to more and better knowledge. Policy makers, however, insist that mono-disciplinary research is not useful for designing and evaluating policy interventions. The creation of mixed research teams (academics and others) to maximize impact creates new challenges. Crossing scientific boundaries and aiming at the creation of societal impact imply that the profiling of research needs a great deal of attention.
The issue is urgent, especially when scientific research is undertaken for the purpose of targeting policy makers and bringing about a change of policy for the benefit of society. Working with journalists in order to reach the general public becomes essential. Sharing and translating conventional academic outputs into more appealing forms, and shaping pieces of information for the use of policy makers, need to be included in research plans. Reaching the public at large involves the use of social media, mixed publications (that is, combinations of academic outputs, media briefs and policy papers) and participation in public debates.
In order to maximize impact, teaching projects that help to actually implement evidence-based policies in government, NGOs and the private sector may be an important instrument for translating curiosity-driven science and facilitating knowledge creation.
In a nutshell, the following best practices may help to shape a research agenda that creates top-notch knowledge and social impact:
- Formulate research agendas that include not only academics but also other stakeholders from outside of academia.
- Formulate a diversified publication strategy that reaches a number of different audiences.
- Formulate a diversified research funding strategy.
- Participate in public debates as a way to improve research quality and relevance.
- Redefine and refine the existing reward criteria, based on the needs of the wider field.