How do I monitor, evaluate and learn about policy engagement?

These pages contain guidance notes, tools and other resources, primarily for researchers and research support staff, to help them monitor and evaluate engagement with public policy, and learn from it. They have been prepared and collated in response to challenges identified by researchers and support staff in a wide range of academic disciplines, and are informed by their insights.

There are three guidance notes, as follows:

  • Guidance Note 1 suggests some principles for researchers and others looking to learn about and improve their engagement with the policymaking community.
  • Guidance Note 2 delves deeper into these principles, and highlights tools that can be used from the outset. These and other tools can also be found in the resource library below.
  • Guidance Note 3 explores the strategic importance of these activities for individual researchers and the University of Oxford, offering recommendations for how to embed these activities within research support structures at different levels. These may also be of interest to other universities, research networks and funding organisations.


Resource library

Tools

Start early

Tilley, H., Shaxson, L., Rea, J., Ball, L., Young, J. (2017). 10 things to know about how to influence policy with research. London: Overseas Development Institute. 

Young, J., Shaxson, L., Jones, H., Hearn, S., Datta, A., Cassidy, C. (2014). ROMA: a guide to policy engagement and influence. London: Overseas Development Institute. 

Recognising complexity and context

Stachowiak, S. (2013). Pathways for Change: 10 Theories to Inform Advocacy and Policy Change Efforts. Washington, DC: Center for Evaluation Innovation. 

Valters, C. (2015). Four principles for Theories of Change in global development. London: Overseas Development Institute. 

Vogel, I. (n.d.). ESPA guide to working with Theory of Change for research projects. Edinburgh: Ecosystem Services for Poverty Alleviation.

Thinking about relationships, power and politics

Coulby, H., Tiberghien, J.E., Barcelo, M. (n.d.). Power analysis tools for WASH governance. WaterAid and Freshwater Action Network. 

Hudson, D., Marquette, H., Waldock, S. (2016). Everyday Political Analysis. Birmingham: Developmental Leadership Program. 

Saunders, M. (2018). How to choose a framing narrative for scientific papers. Ecology Is Not A Dirty Word.

The Change Agency (n.d.). Power Mapping Template. Islington, NSW: The Change Agency.

Tracking contacts, networks and coalitions

Cvitanovic, C., Cunningham, R., Dowd, A-M., Howden, S.M., van Putten, E.I. (2017). Using Social Network Analysis to Monitor and Assess the Effectiveness of Knowledge Brokers at Connecting Scientists and Decision-Makers: An Australian case study. Environmental Policy and Governance 27: 256–269. 

Evernote (2020). Note-taking and organisation tool that can be used to keep records of contacts, meetings and interactions.

Given, L.M. (2012). Research Diaries and Journals. In L.M. Given (Ed.), The SAGE Encyclopedia of Qualitative Research Methods. Thousand Oaks, CA: Sage Publications.

Schiffer, E., Hauck, J. (2010). Net-Map: Collecting Social Network Data and Facilitating Network Learning through Participatory Influence Network Mapping. Field Methods 22(3): 231-249. 

Younis, M. (2017). Evaluating Coalitions and Networks: Frameworks, Needs, and Opportunities. Washington, DC: Center for Evaluation Innovation. 

Tracking outcomes and impacts: exploring multiple pathways

Overton is the world’s largest searchable database of full-text policy documents, containing more than six million parliamentary transcripts, government guidance documents and think-tank outputs from 29,000 organisations in 185 countries. Use Overton to find who is citing your own or other people’s research, to identify or evidence impact case studies, to add policy influence indicators to annual reports, and to find new collaborators. Please note that first-time users must access the site through the University’s VPN.
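For researchers comfortable with a little scripting, the sketch below illustrates how this kind of citation lookup could be automated against a policy-document database such as Overton. It is a minimal Python sketch only: the endpoint URL, query syntax and response fields are illustrative assumptions, not Overton's documented interface, so consult Overton's own help pages (and note the VPN requirement above) before adapting it.

    # Minimal sketch only. The endpoint, query syntax and response fields
    # below are assumptions for illustration, not Overton's documented API.
    import requests

    API_URL = "https://app.overton.io/documents.php"  # hypothetical endpoint
    API_KEY = "YOUR_API_KEY"                          # placeholder credential

    def find_policy_citations(doi):
        """Return policy documents that cite the research output with this DOI."""
        params = {
            "query": f"cited_doi:{doi}",  # hypothetical query syntax
            "format": "json",
            "api_key": API_KEY,
        }
        response = requests.get(API_URL, params=params, timeout=30)
        response.raise_for_status()
        documents = response.json().get("results", [])
        # Keep only the fields useful for an annual report or impact case study.
        return [
            {
                "title": doc.get("title"),
                "source": doc.get("source"),
                "country": doc.get("country"),
                "published": doc.get("published_on"),
            }
            for doc in documents
        ]

    if __name__ == "__main__":
        for hit in find_policy_citations("10.1234/example-doi"):
            print(hit)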

Civicus (2019). Outcome Harvesting.

Meagher, L., Edwards, D. (2020). A framework to evaluate the impacts of research on policy and practice. Integration and Implementation Insights.

Outcome Harvesting (2020).

Poirier, C., Tolmie, C. (2020). Measuring Governance, Advocacy and Power: A Database of Existing Indicators, Tools and Indices. Washington, DC: Results for Development. 

Wilson-Grau, R. (2015). Outcome Harvesting. Better Evaluation.

Make space for learning and reflection

Cabaj, M. (2019). Evaluating Systems Change Results: An Inquiry Framework. Waterloo, ON: Tamarack Institute.

Franz, N.K. (2013). The Data Party: Involving Stakeholders in Meaningful Data Analysis. Journal of Extension 51(1).

Mayne, J. (2008). Contribution Analysis: An approach to exploring cause and effect. ILAC methodological brief.

NHS (n.d.). Knowledge sharing cultures. Guidance and tools to support systematic learning before, during and after project activity in health and care. London: National Health Service. 

NHS (n.d.). Learning Handbook: Guidance and tools to support systematic learning before, during and after project activity in health and care. London: National Health Service. 

ODI (2012). RAPID outcome assessment. London: Overseas Development Institute.

Ramalingam, B. (2006). Tools for Knowledge and Learning. ODI Toolkit. London: Overseas Development Institute.

Reeler, D. (n.d.). Sharing Experience and Learning Together. Tearfund.

Ensure evaluation is fit for purpose

Cassidy, C., Ball, L. (2018). Communications monitoring, evaluating and learning toolkit. London: Overseas Development Institute.

Centers for Disease Control and Prevention (2017). Evaluation Reporting: A Guide to Help Ensure Use of Evaluation Findings. Washington, DC: US Department of Health & Human Services.

Innovation Network (n.d.). Pathfinder: A Practical Guide to Advocacy Evaluation.

Pasanen, T., Shaxson, L. (2016). How to design a monitoring and evaluation framework for a policy research project. London: Overseas Development Institute.

Schmitt, J. (n.d.). The use of causal mechanisms in complex evaluations. Methods and Standards. Bonn: German Institute for Development Evaluation.

Other

Barnett, C., Gregorowski, R. (2013). Learning about Theories of Change for the Monitoring and Evaluation of Research Uptake. IDS Practice Paper in Brief 14. Brighton: Institute of Development Studies.

Befani, B. (2013). Between complexity and generalization: Addressing evaluation challenges with QCA. Evaluation 19(3): 269–83.

Befani, B., and Mayne, J. (2014). Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation. IDS Bulletin 45(6).

Collier, D. (2011). Understanding Process Tracing. Political Science and Politics 44(4): 823-830. 

Edwards, D.M., Meagher, L.R. (2020). A framework to evaluate the impacts of research on policy and practice: A forestry pilot study. Forest Policy and Economics 114.

Grove, J.T. (2015). Aiming for utility in ‘systems-based evaluation’: A research-based framework for practitioners. IDS Bulletin 46(1): 58–70.

INTRAC (2017). Resources for M&E. Oxford: INTRAC. 

Oliver, K., Faul, M.V. (2018). Networks and network analysis in evidence, policy and practice. Evidence & Policy 14(3): 369-379.

University of Nottingham (n.d.). Routes to policy impact: A practical guide for academics and researchers.

Wilson, K.M., Brady, T.J., Lesesne, C. (2011). An Organizing Framework for Translation in Public Health: The Knowledge to Action Framework. Preventing Chronic Disease 8(2): A46.

Resources

Research-policy linkages

Boswell, C., Smith, K. (2017). Rethinking policy ‘impact’: four models of research-policy relations. Palgrave Communications 3(44): 1-10.

Broadbent, E. (2012). Politics of research-based evidence in African policy debates: Synthesis of case study findings. Evidence-based Policy in Development Network. London: Overseas Development Institute. 

Cairney, P. (2013). What is evolutionary theory and how does it inform policy studies? Policy & Politics 41(2): 279-298.

Cairney, P. (2016). The Politics of Evidence-Based Policy Making. London: Palgrave Macmillan.

Donadelli, F. (2020). When evidence does not matter: The barriers to learning from science in two cases of environmental policy change in Brazil. Science and Public Policy.

Georgalakis, J., Rose, P. (Eds.) (2019). Exploring Research–Policy Partnerships in International Development. IDS Bulletin 50(1). Brighton: Institute of Development Studies. 

Hardiman, N., Metinsoy, S. (2019). Evidence matters, but ideas shape policy in more fundamental ways than we might realise. London: LSE Impact Blog. 

Hayden, M.C., Petrova, M.K., Wutti, D. (2018). Direct associations of the terminology of knowledge transfer – Differences between the social sciences and humanities (SSH) and other scientific disciplines. Trames Journal of the Humanities and Social Sciences 22(3): 239-256.

Head, B.W. (2015). Toward More “Evidence-Informed” Policy Making? Public Administration Review 76(3): 472-484.

Langer, L., Tripney, J., Gough, D. (2016). The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. London: UCL Institute of Education. 

Newman, J., Cherney, A., Head, B.W. (2017). Policy capacity and evidence-based policy in the public service. Public Management Review 19(2): 157-174. 

Olson, J., da Silva, P.P. (2019). Knowledge production at the science–policy interface: Lessons from fisheries scientists. Science and Public Policy 47(1): 45-66.

Orensten, N., Buteau, E., Martin, H., Gehling, K. (2020). Policy Influence: What Foundations are Doing and Why. Cambridge, MA and San Francisco, CA: The Center for Effective Philanthropy.

Parkhurst, J. (2017). The Politics of Evidence: From evidence-based policy to the good governance of evidence. Abingdon: Routledge.

Rittel, H.W.J., Webber, M.M. (1973). Dilemmas in a general theory of planning. Policy Sciences 4(2): 155–69.

University of Cambridge Public Policy SRI (2017). Policy Impact: A ‘how to’ guide for researchers. Cambridge: University of Cambridge.

Engagement strategies for effective research uptake and its monitoring and evaluation

Basbøll, T. (2018). We need our scientists to build models that frame our policies, not to tell stories that shape them. London: LSE Impact Blog. 

Bornmann L. (2013). What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology 64(2): 217–233.

Cairney, P., Oliver, K. (2018). How Should Academics Engage in Policymaking to Achieve Impact? Political Studies Review 18(2): 228-244.

Cartwright, N., Hardie, J. (2012). Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford: Oxford University Press.

Dennison, J. (2020). What policy communication works for migration? Using values to depolarise. Florence: European University Institute. 

Dinesh, D., Zougmoré, R.B., Vervoort, J.M., Totin, E., Thornton, P.K., Solomon, D., Shirsath, P., Pede, V., López Noriega, I., Läderach, P., Körner, J., Hegger, D., Girvetz, E.H., Friis, A.E., Driessen, P.P.J., Campbell, B.M. (2018). Facilitating Change for Climate-Smart Agriculture through Science-Policy Engagement. Sustainability 10(8), 2616.

Fooks, L. (2019). Humanities and Policy Engagement: An introduction for researchers. The Oxford Research Centre in the Humanities. Oxford: University of Oxford.

Funnell, S., Rogers, P. (2011). Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. San Francisco, CA: Jossey-Bass.

Malaria Consortium (2016). Guide to developing and monitoring a research uptake plan. London: Malaria Consortium Headquarters.

Moore, G., Todd, A., Redman, S. (2009). Strategies to increase the use of evidence from research in population health policy and programs: a rapid review. Sydney: Sax Institute.

Newman, J., Cherney, A., Head, B.W. (2016). Do Policy Makers Use Academic Research? Reexamining the “Two Communities” Theory of Research Utilization. Public Administration Review 76(1): 24-34.

Nutley, S.M., Walter, I., and Davies, H.T.O. (2007). Using Evidence: How Research Can Inform Public Services. Bristol: The Policy Press.

Ostrom, E. (2007). A diagnostic approach for going beyond panaceas. Proceedings of the National Academy of Sciences of the United States of America 104(39): 15181-15187.

Ramalingam, B., Wild, L., Buffardi, A.L. (2019). Making adaptive rigour work: Principles and practices for strengthening monitoring, evaluation and learning for adaptive management. London: Overseas Development Institute.

Rose, D.C., Evans, M.C., Jarvis, R.M. (2020). Effective engagement of conservation scientists with decision-makers. In W.J. Sutherland, P.N.M. Brotherton, Z.G. Davies, N. Ockendon, N. Pettorelli & J.A. Vickery (Eds.), Conservation Research, Policy and Practice (Ecological Reviews): 162-182. Cambridge: Cambridge University Press.

Sasse, T., Haddon, C. (2019). How academia can work with government. London: Institute for Government.

Stewart, R., Langer, L., Erasmus, Y. (2018). An integrated model for increasing the use of evidence by decision-makers for improved development. Development Southern Africa 36(5): 616-631. 

Šucha, V., and Sienkiewicz, M. (2020). Science for Policy Handbook. Oxford: Elsevier. 

University of Oxford Public Engagement with Research Team (2015). Public Engagement with Research Strategic Plan. Oxford: University of Oxford.

University of Oxford Policy Engagement Team (2020). Guidance on Policy Engagement Internationally.

Wallis, K. (2020). How an audience-first approach to social media increases engagement with your research. London: LSE Impact Blog.

Evaluation, complexity, and systems

Barbrook-Johnson, P., Proctor, A., Giorgi, S. (2020). How do policy evaluators understand complexity? Evaluation 26(3): 315-332.

Byrne, D. (2013). Evaluating complex social interventions in a complex world. Evaluation 19(3): 217-228.

Fletcher, A., Jamal, F., Moore, G., Evans, R.E., Murphy, S., Bonell, C. (2016). Realist complex intervention science: Applying realist principles across all phases of the Medical Research Council framework for developing and evaluating complex interventions. Evaluation 22(3): 286–303.

Marra, M. (2015). Cooperating for a more egalitarian society: Complexity theory to evaluate gender equity. Evaluation 21(1): 32-46.

Moore, G.F., Evans, R.E., Hawkins, J., Littlecott, H., Melendez-Torres, G.J., Bonell, C., Murphy, S. (2019). From complex social interventions to interventions in complex social systems: Future directions and unresolved questions for intervention development and evaluation. Evaluation 25(1): 23-45.

Mowles, C. (2014). Complex, but not quite complex enough: The turn to the complexity sciences in evaluation scholarship. Evaluation 20(2): 160-175. 

Patton, M. (2010). Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York, NY: Guilford Press.

Sanderson, I. (2000). Evaluation in Complex Policy Systems. Evaluation 6(4): 433-454.

Westhorp, G. (2012). Using complexity-consistent theory for evaluating complex systems. Evaluation 18(4): 405–20.

Woolcock, M. (2013). Using case studies to explore the external validity of ‘complex’ development interventions. Evaluation 19(3): 229–48.

Navigating funding regimes

Boaz, A., and Hanney, S. (2020). The role of the research assessment in strengthening research and health systems. London: LSE Impact Blog. 

Chowdhury, G., Koya, K., Philipson, P. (2016). Measuring the Impact of Research: Lessons from the UK’s Research Excellence Framework 2014. PLOS ONE 11(6): e0156978.

Gaskell, G. (2020). Nine steps to achieve research integrity and build trust. London: LSE Impact Blog.

Hill, S. (2016). Assessing (for) impact: future assessment of the societal impact of research. Palgrave Communications 2: 16073.

Jogalekar, A. (2014). Are we entering a golden era of private science funding? New York, NY: Scientific American.

King's College London, Digital Science (2015). The nature, scale and beneficiaries of research impact: An initial analysis of Research Excellence Framework (REF) 2014 impact case studies. Research Report 2015/01. Bristol: Higher Education Funding Council for England.

Nurse, P. (2015). Ensuring a successful UK research endeavour: A Review of the UK Research Councils. Independent Report. Department for Business, Innovation & Skills. London: Her Majesty’s Government.

Reed, M., Simon, K. (2017). How much is an impact case study worth in the UK Research Excellence Framework? Fast Track Impact Blog.

REF (2019). Guidance on submissions. Research Excellence Framework.

Smith, K. E., Bandola-Gill, J., Meer, N., Stewart, E., Watermeyer, R. (2020). The Impact Agenda: Controversies, Consequences and Challenges. Bristol: The Policy Press.

Stern, N., Sweeney, D. (2020). Institutions must be bold with impact in REF 2021. Bristol: REF 2021 Blogs.

Weir, N. (2014). A short history of science funding: Why was CaSE created and why is it still here? The Biochemist 36(4): 4-6.

Key websites

Better Evaluation
Covers work on ‘evaluation’, which includes the full range of monitoring and evaluation activities, frameworks and systems, as well as guidance on evaluating projects, programs, policies, products, networks, organisations and strategies. It also explores evaluation approaches known by other labels, such as impact assessment, impact measurement and return on investment.

LSE Impact blog
A hub for researchers, administrative staff, librarians, students, think tanks, government, and anyone else interested in maximising the impact of academic work in the social sciences and other disciplines. It covers key debates, shares best practice and keeps the impact community up to date with news, events and the latest research.

Paul Cairney’s blog
A wide range of material on the politics of public policy and of evidence-based policymaking, with a particular focus on the UK.

Research to Action
Caters to the strategic and practical needs of people trying to improve the uptake of development research, in particular research funded by the UK Government’s Foreign, Commonwealth and Development Office (FCDO) and, previously, the Department for International Development (DFID).

Acknowledgements

This guidance was prepared by Professor Chris Roche, Alana Tomlin and Ujjwal Krishna, in consultation with the University’s Policy Engagement Team. They would like to thank all those who shared relevant experience and insights, in particular: Dr Ariell Ahearn Ligham, Dr Richard Baxter, Dr Anuj Bhatt, Professor Lucie Cluver, Dr Annaleise Depper, Dr Amy Hinsley, Dr Elizabeth Hodges, Professor Susan Jebb, Professor Katrin Kohl, Dr Ian Lyne, Professor Peter McCulloch, Maria Michalopoulou, Charlotte Medland, Dr Vicky McGuinness, Dr Anne Mortimer, Dr Kathryn Oliver, Dr Koen Pouwels, Dr Francesca Richards, Laura Mason, Stephen Meek, Dr Sharron Pleydell-Pearce, Jess Ryan-Phillips, Dr James Watkins, Professor Catherine Schenk and Dr Caroline Wood.

The Policy Engagement Team would welcome feedback on this guidance, which can be sent to [email protected].