
Evaluation Implications of the Coronavirus Emergency

March 27, 2020

I have been doing evaluations and writing about the evaluation profession for nearly 50 years. For the last decade, I’ve been writing about evaluation under conditions of complexity (Developmental Evaluation) and global systems transformations (Blue Marble Evaluation). I’ve been getting queries from colleagues young and old, novice evaluators and long-time practitioners, asking how I’m making sense of the global health emergency and what I think the implications may be for evaluation. For what it’s worth, here’s my take on where we are and what it means.

1. Adapt evaluation plans and designs now. All evaluators must now become developmental evaluators, capable of adapting to complex dynamic systems and preparing for the unknown: uncertainty, turbulence, lack of control, nonlinearity, and the emergence of the unexpected. This is the current context around the world, and it is the context in which evaluation will operate for the foreseeable future.

2. Be proactive. Don’t wait, and don’t assume this is going to pass quickly. Connect with those who have commissioned your evaluations, the stakeholders with whom you’re working to implement them, and those to whom you expect to report, and start making adjustments and contingency plans. Don’t wait for them to contact you. Evaluation is the last thing on the minds of people who aren’t evaluators. They won’t be thinking about how the crisis affects evaluations. That’s your job as an evaluator, so get to work doing it. Adjustments need to be made now, sooner rather than later. Offer to help update your evaluation. This doesn’t necessarily mean delaying data collection; it may mean accelerating it to get up-to-date information about the effects of the crisis. For example, a planned survey of parent involvement in schools becomes a quick survey about how families are adjusting to school closures.

Illustration: Simon Kneebone asks, “What’s changed?”
Illustrations are by Australian cartoonist Simon Kneebone, from the Australian Evaluation Society’s 2017 conference on Evaluating Transformation and from the book Blue Marble Evaluation.

3. Make it about use, not about you. The job of the people you work with is not to comfort or help you as an evaluator. Your job is to help them, to let them know that you are prepared to be agile and responsive, and you do so by adapting your evaluation to these changed conditions. This may include pushing to keep evaluations from being neglected or abandoned, which means adapting to ensure the ongoing relevance and utility of evaluative thinking and findings. For example, in an international project with many field locations, instead of continuing to administer a routine quarterly monitoring survey, we’ve created a more useful short open-ended survey about how people are being affected and adapting at the local level, and what they need now from headquarters.

This post from Michael Quinn Patton originally appeared at the Blue Marble Evaluation blog and is reposted with permission. Blue Marble Evaluation focuses on integrating design, engagement, implementation, and evaluation of programs and interventions of all kinds, especially initiatives working on making global systems more sustainable.

4. Real-time data rules. Channel your sense of urgency into thinking pragmatically and creatively about what data you can gather quickly and provide to your evaluation users to help them know what’s happening, what’s emerging, how needs are changing, and what options to consider going forward. At the same time, help them document the changes in implementation they are making as a result of the crisis, and the implications and results of those changes. You may be able to gather data and provide feedback about perceptions of the crisis and its implications, finding out how much those affected are on the same page in terms of message and response. That’s what developmental evaluators do. (See #1 above.)

5. Consider the “good enough” standard of rigor. Detach from rigor as an absolute methodological standard. Rigor serves use, and crisis conditions put us solidly in the situation of doing what pioneering evaluator Peter Rossi called “good enough” evaluations. Decisions are being made quickly. Having some data to support those decisions when they are made is better than having data that arrive too little, too late. This places “rigor” in the context of crisis conditions, acknowledging uncertainty, emergence, and urgency. For example, a smaller purposeful sample of interviews with a few diverse program staff and participants may be done more quickly, and be more useful, than developing and administering a full survey.

6. Everything changes in a crisis. Embrace change, don’t resist it. Program goals may appropriately change. Measures of effectiveness may change. Target populations may change. Implementation protocols may change. Outcome measures may change. This means that evaluation designs, data collection, reporting timelines, and criteria will and should change. Intended uses and even intended users may change. Expect change. Facilitate change. Document changes and their implications. That’s your job in a crisis, not to go on in a comfortable business-as-usual mindset. There is no business-as-usual now. And if you don’t see program adaptation, consider pushing for it by presenting options and introducing scenario thinking at a program level. Take risks, as appropriate, in dealing with and helping others deal with what’s unfolding.

7. Engage in systems thinking. If you have been putting off bringing systems thinking to your evaluations, now is the time. If you’ve already been bringing systems thinking to your work, now is the time to go deeper and demonstrate to those you work with the relevance and importance of thinking systemically about what is happening. Public health, community health, national health, global health, your family’s health, and your personal health are all connected. This is micro to macro, and macro to micro, systems thinking. The state of public health is connected to the economy, the financial system, politics at every level, social well-being, cultural perspectives, educational inequities, social and economic disparities, public policy decisions, and evaluation. Practice seeing the interconnections and their implications for your work, your evaluations, and your life. Celebrate the initiatives of young people worldwide to build a more sustainable and equitable future.

8. Think globally, act locally. Zoom out to understand the big picture of what’s happening globally and zoom in to the implications locally, where locally means whatever level you’re working at. We all know that context matters for every evaluation, but what is involved in contextual assessment has now expanded to a global level. Use this crisis to hone your evaluative thinking skills to understand how global patterns and trends intersect with and affect what happens locally, including in your own evaluations at whatever level you are operating. Context matters. The whole world is now part of our evaluation context. The Global South and Global North will be intertwined as the global health emergency deepens and broadens.

9. Prepare to make the case for evaluation’s value. Expect proposals to cut back evaluation funding. Reduced evaluation demand and declining evaluation resources will flow from this crisis. Evaluation budgets, units, and personnel have always been vulnerable in times of crisis; evaluations have historically been among the first targets for budget cuts during recessions and political turmoil. The economic ripple effects won’t show up for a few months, but they will manifest, be sure of that. Economic recession is a certainty. The way government money is currently being poured into the economy is not sustainable and will lead to major cutbacks down the road. Those cutbacks may target and hit evaluation hard. Prepare by working now to make evaluation more useful and real-time data indispensable, so that the evaluation value proposition reframes evaluation as an essential activity rather than a mundane bureaucratic function or a luxury for good times. Define, conceptualize, articulate, and demonstrate the essential utility of evaluation. Lay the groundwork now, not when the cutbacks are announced. Your future as an evaluator and the future of the evaluation profession depend now more than ever on demonstrating evaluation’s cost-beneficial utility: our capacity to demonstrate value for money through evaluation use and through the process use of applying adaptive, crisis-informed inquiry frameworks and complexity-informed recommendations.

10. Be knowledgeable, be a fact checker. Trustworthy, valid, and useful information is at a premium in a crisis. Misinformation, bad data, truly fake news, false rumors, distorted statistics, lies, and well-meaning but wrong interpretations are rampant, dangerous, and can cost lives. As evaluators, we are all knowledge workers. We bear a societal responsibility to serve the public good. This means staying informed, being a fact checker, and helping ensure that facts trump ideology and politics. This goes beyond conducting any single evaluation to our collective responsibility to society as knowledge workers and evaluation scientists. Play that role, and identify yourself as an evaluator when doing so. After a speech I gave as the coronavirus was first emerging, the sound technician came up to me and asked if I knew that the virus came from a secret Chinese chemical weapons facility built on an ancient civilization where they were mining vicious microbes. I invited him to sit with me for a minute and took him to a site that debunked that story to his satisfaction.

11. Model systematic evaluative thinking. The media are filled to overflowing with opinions about what’s working and not working, what’s been done well and poorly, and who’s to blame and who gets credit. Everyone is an evaluator. But we are professional, systematic evaluators. Evaluate for yourself, with skill, care, and thoughtfulness, what’s working and not working to mitigate the crisis. Be prepared to render judgments as appropriate based on cumulative evidence, but also be prepared to demonstrate evaluative thinking when evidence is inadequate, when judgments are premature, and when the facts are uncertain. Refrain from expressing uninformed or premature judgments, and urge others to do likewise.

12. Advocate for better data. Reports of the incidence and prevalence of the coronavirus appear to be problematic in many cases. Ongoing, systematic, stratified random sample testing will be needed to establish population infection rates; a minimal sketch of that kind of estimate follows below. Understand the strengths and weaknesses of the epidemiological statistics. Here’s one example of the statistical debate (read both the article and the comments).
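To make the weighting logic concrete, here is a minimal sketch in Python of how a stratified random sample can be turned into a population-level infection rate estimate. The strata, population sizes, and test results are entirely hypothetical illustrations, not real data, and a real design would also account for test accuracy and sampling error.

```python
# Minimal sketch (illustrative only): estimating a population infection rate
# from a stratified random sample. All strata, population sizes, and test
# results below are hypothetical numbers invented for this example.

strata = {
    # stratum name: (population_size, tests_performed, positives_found)
    "urban":    (600_000, 1_000, 42),
    "suburban": (300_000,   500, 11),
    "rural":    (100_000,   300,  3),
}

total_population = sum(pop for pop, _, _ in strata.values())

# Weight each stratum's observed positivity rate by its share of the population,
# so strata that happen to be tested more heavily do not dominate the estimate.
estimated_prevalence = sum(
    (pop / total_population) * (positives / tests)
    for pop, tests, positives in strata.values()
)

print(f"Estimated population infection rate: {estimated_prevalence:.2%}")
```

The point of the sketch is the contrast it highlights: a raw positivity rate pooled across all tests reflects who happened to get tested, whereas the population-weighted estimate reflects who lives in the population, which is what stratified random sampling is designed to recover.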

13. Spotlight the need for global, longer-term sustainability transformation. The global health emergency is a short-term crisis within the larger and longer-term global climate emergency. This health crisis has revealed both the importance and the possibility of systems transformation, and it illuminates the scale, scope, and urgency of the systems transformations needed worldwide to create a more sustainable and equitable future. The pandemic is exposing the fragmented and fragile nature of current systems, which are inadequate for a just and equitable world. As your work adapts to the current reality, think about how you can bring this larger perspective to bear: be attentive to, gather evidence for, and support the kinds of transformations that may be needed after the pandemic subsides. Balancing long-term threats to the future of humanity against the urgent demands of short-term, crisis-generated interventions requires in-depth transformative evaluative thinking. Evaluators need to be prepared to contribute to finding and following pathways and trajectories toward transformations for a more sustainable future.

14. Keep learning.  Stay on top of developments in diverse fields that can inform theories of change and transformation. And return to classic sources of knowledge and wisdom. My own source of distraction and renewal is philosophy of science. I offer these reflections from a stance of what philosophers of science call epistemic humility. If social distancing and/or quarantine has given you time for philosophic reflection, check out perspectives on epistemic humility, knowledge, wisdom, and rationality at the Stanford Encyclopedia of Philosophy.

15. Support each other as an evaluation community. Buddy up. Stay connected to other evaluators. Participate actively in our professional networks and associations. We are a global community. Despite differences in approaches, varied methodological preferences, diverse inquiry priorities, and debates about preferred criteria and standards, we share a commitment to evidence-based decision-making, evaluative thinking, and using evaluation to make a better world. That’s our niche, and it’s a critical one. Tough times are ahead. Let’s stay connected. Let’s support each other. What we do matters. Stay healthy. Stay strong. Stay sane. Stay active in our global evaluation community. Think about what contributions you can make, as an evaluator, to mitigate the crisis. And, especially, support EvalYouth, the future of our profession.

Acknowledgements

My thanks to Blue Marble Evaluation team members and colleagues Lija Greenseid, Glenn Page, Pablo Vidueira, and Charmagne Campbell-Patton for feedback and suggestions on a draft version of these reflections.

Michael Quinn Patton is an independent consultant with more than 40 years’ experience conducting applied research and program evaluations. He was on the faculty of the University of Minnesota for 18 years, including five years as director of the Minnesota Center for Social Research, where he was awarded the Morse-Amoco Award for innovative teaching. He has authored a number of books on evaluation, many for SAGE Publishing (the parent of Social Science Space), and his recent Blue Marble Evaluation for Guilford. Quinn Patton is a former president of the American Evaluation Association and recipient of both the Alva and Gunnar Myrdal Award for Outstanding Contributions to Useful and Practical Evaluation and the Paul F. Lazarsfeld Award for Lifelong Contributions to Evaluation Theory from the American Evaluation Association. The Society for Applied Sociology presented him the Lester F. Ward Award for Outstanding Contributions to Applied Sociology.

