A year or two after the Berlin Wall came down I was taken to Moscow by police colleagues to discuss the research into criminal behaviour that was developing there. To my surprise I was told that they had a special government unit whose job was to convert research into policy and practice. They did not think researchers could necessarily understand the implications of their own findings; translation was a special skill that needed to be carried out by those not involved in the original research. I think this idea had its roots in the Marxist distinction between interpreting the world and actually changing it – praxis.
A quarter of a century on, the Russian distinction between theory and praxis has still not taken hold anywhere else that I know of. Indeed, there is growing pressure on academic researchers themselves to ensure that their studies have impact. It has become a mantra that social science research must demonstrate it makes a difference – in Marx’s aphorism, ‘to change the world’. Demonstrating such effects is now an integral part of the assessment of research in the UK and many other countries, and a common requirement for obtaining research funding.
This pressure, though, does make me wonder how the impact of the enormously expensive radio telescopes, particle accelerators and other apparatus of big science is argued. They seem to be encouraged to demonstrate a contribution to greater understanding of the universe. The social sciences, by contrast, do not seem to be encouraged to provide a greater appreciation of the nature of criminality, social mobility, financial exchange, the development of cultures, or many other fundamental social questions, unless the direct consequences of such elucidation can be demonstrated for government policies, practical interventions or economic improvements. I wonder if this reflects a general view in the administrative bureaucracies that social research does not deal with anything fundamental, so that if it is to be supported at all it has to earn its keep, more or less literally.
The impact imperative has inevitably become a topic of research in its own right, its aim being, of course, to increase the uptake of academic research. The wide range of such research is unpacked in the recent bumper issue of the journal Contemporary Social Science (Volume 8, Number 3), guest edited by Jon Bannister and Irene Hardill. They have brought together fourteen papers that explore many different aspects of how social science knowledge can be mobilised and how those who can use it may be encouraged to engage with it. In general, these papers are tremendously optimistic about the potential for social science to make a real contribution. Indeed, the editors go even further, stating that most of the challenges that concern the public and government need good social science research evidence if they are to be overcome.
This is all at one with the laudable mission of the Academy of Social Sciences to ‘make the case for the social sciences’. The Academy’s series of ‘Making the Case’ booklets, covering everything from sustainability to crime, certainly shows that there are plenty of interesting examples of the social value of social science. But what is emerging from careful examination of the processes of knowledge mobilisation, as it is now being called, is that the pathways to ensuring that social science has an impact are strewn with hazards. As the papers in the special issue make clear, the path is neither straightforward nor without possible negative consequences. Nor does it move in only one direction. Once social scientists embrace the potential users of their work, or ‘dance with new partners’ as Bannister and Hardill put it, the very nature of what it means to be scientific can become confused.
What is needed to increase knowledge mobilisation may also challenge the ethics, independence, integrity, rigour and objectivity of social science research carried out with practical uptake in mind. The drive for impact may also be marginalising more fundamental social science research, even though in big science it is precisely such fundamental research that gets onto prime-time television. Science is about going beyond the data, as many famous scientists have noted – about developing ways of encapsulating aspects of the world that are relevant beyond any particular study. That sort of theory development is much more difficult if policy makers and practitioners are looking for answers to their specific problems.
The challenge of doing good social science in a way that encourages the uptake of its results is made clear in the paper by Jon Bannister and Anthony O’Sullivan in the special issue. They show that what policy makers respond to is a good story illustrated by a telling case study – the very antithesis of social science academic practice, with its caveats and statistics. But Bannister and O’Sullivan are correct. I know, for example, from my own study of fire regulations that many are derived from what happened in one particular, dramatic incident rather than from detailed study across many examples, as would be the case for scientific research. This is also shown by how much government policy is still a response to particular events, despite the claim that it is evidence led.
Paradoxically, the most appealing narrative that is gaining ground, especially in relation to what is called ‘evidence led’ policy – notably, at present, in policing – is the randomised controlled trial. It is grasped with all the gullibility of accepting a good story. In the urge to get police forces to develop their strategies on the basis of evidence, the word is out that the only good evidence is the result of a randomised controlled trial. Such trials, in effect, generate a tidy narrative that can be encapsulated in a neat example along the lines of: the police did this and this is what happened, but when they didn’t it didn’t happen. But this ignores the very limited applicability of the randomised controlled trial to many real-world problems, especially in the area of criminality. As many have pointed out, such trials can only be carried out usefully when conditions during the trial stay constant and there are no ethical limitations on what relevant experimental conditions can be introduced. Furthermore, the results only make sense if distinct interventions can be clearly isolated, defined and introduced, and if the consequences of those interventions are linear, so that extrapolation from the inevitably limited experimental conditions is valid. Most problems facing society are not so tidy.
To follow up the dance metaphor: if the people who pay for research increasingly call the tune, the music can get rather limited and monotonous. Hopefully this will make everyone aware that there are times when the dancing has to stop, so that the partners can get their breath back and learn new dances.