Search results: 26
Realist synthesis: an introduction
This guide, written by Ray Pawson, Trisha Greenhalgh, Gill Harvey and Kieran Walshe for the ESRC Research Methods Programme, provides an introduction to using realist synthesis, with a focus on h… (Resource)

Learning from research: Systematic reviews for informing policy decisions
This guide from the Alliance for Useful Evidence is an introduction to systematic reviews and the necessary steps that should be considered as part of the process. (Resource)

Innovations in evaluation: How to choose, develop and support them
This brief opens up some of the issues and questions about why and how to adopt innovations in evaluation, and discusses how innovations can be useful in addressing eight long-standing challenges in evaluation. (Resource)

A short primer on innovative evaluation reporting
This book by Kylie Hutchinson presents a number of innovative ways of reporting, including different methods for presentations, narrative summaries, presenting findings visually and making use of digital outputs. (Resource)

Global innovations in measurement and evaluation
This report by NPC highlights their research into the latest developments in theory and practice in measurement and evaluation. The authors found that new thinking, techniques and technology are influencing and improving practice. (Resource)

Theory maker
This free, open-source, web-based tool was made by Steve Powell as a quick and simple way of creating a theory of change. The information provided was supplied by Steve Powell. (Resource)

Dylomo
Dylomo is a free, web-based tool that can be used to create interactive, online logic models. (Resource)

Methods for conducting systematic reviews
This guide, from the EPPI-Centre, looks at the processes involved when conducting systematic reviews. Covering the key steps involved, the guide focuses on four main areas, from ap… (Resource)

Choosing appropriate designs and methods for impact evaluation - Department of Industry, Innovation and Science
The Department of Industry, Innovation and Science has commissioned this report to explore the challenges and document a range of possible approaches for the impact evaluations that the department conducts. (Resource)

Emerging Opportunities: Monitoring and Evaluation in a Tech-Enabled World
Emerging Opportunities: Monitoring and Evaluation in a Tech-Enabled World, a discussion paper written by Linda Raftree and Michael Bamberger under a grant from The Rockefeller Foundation to Itad, provides an overview of how the practice of… (Resource)

3D Impact Analysis: A New Tool to Approach Impact Evaluations
In this seminar, Rob D. van den Berg proposes an approach to ‘3D impact analysis’, which starts from the recognition that demand for impact evidence is wide-ranging and should be analysed structurally before it can be met by evaluations. (Resource)

Introducing systematic reviews
This is Chapter 1 of the book An Introduction to Systematic Reviews. (Resource)

Week 7: Innovation in evaluation
This is the first in a series of blogs on innovation, which includes contributions from Thomas Winderl and Julia Coffman. (Blog)

Week 8: Guest blog: Innovation in development evaluation
Development aid is changing rapidly – so must development evaluation. (Blog)

Week 9: Innovation in evaluation part 3: what’s the latest in advocacy evaluation?
Julia Coffman is Director of the Center for Evaluation Innovation. In the third blog of our innovation in evaluation series, she looks at some recent innovations in a notoriously tricky area: advocacy evaluation. (Blog)

Week 12: Evaluation innovation in transparency and accountability
Innovation is a relative concept. It is about new practice … for the topic and person or group in question. (Blog)

How to choose, develop, and support innovation in evaluation
This blog is an abridged version of the brief Innovations in evaluation: How to choose, develop and support them, written by Patricia Rogers and Alice Macfarlan. (Blog)

Global innovations in measurement and evaluation
This guest blog is by Anoushka Kenley from NPC, one of the authors of NPC's recent report Global Innovations in Measurement and Evaluation. (Blog)

Analyzing cause and effect in environmental assessments: Using weighted evidence from the literature
This article describes the Eco Evidence analysis framework, a type of causal criteria analysis that uses available evidence to assess support for a hypothesis. (Resource)

L’évaluation en contexte de développement
This manual is intended for anyone wanting an introduction to program evaluation, particularly in the context of development and international cooperation. To that end, while walking through a classic evaluation process, it presents… (Resource)

Challenges for evidence-based environmental management: What is acceptable and sufficient evidence of causation?
This paper explores the use of the Eco Evidence framework in answering the question "what is acceptable and sufficient evidence of causation?" in environmental assessments. (Resource)

Enhancing program performance with logic models
Developed by the University of Wisconsin Extension service, this resource provides an introduction to developing and using a particular version of the results chain. (Resource)

An accountability framework for technological innovation
This brief provides a range of recommendations to support organisations that are conducting research and design (R&D) to be more accountable. (Resource)

Meta-analysis
Meta-analysis is a statistical method for combining numeric evidence from experimental (and sometimes quasi-experimental) studies to produce a weighted average effect size. (Method)

W.K. Kellogg Foundation logic model guide
The W.K. Kellogg Foundation Logic Model Guide shows how to develop a results chain model. (Resource)

Systematic reviews
This video lecture, given by Dr Philip Davies for the Asian Development Bank (ADB) and the International Initiative for Impact Evaluation (3ie), provides guidance on using a comprehensive systematic review to present the balance of research… (Resource)
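The Meta-analysis entry above describes combining effect sizes from multiple studies into a weighted average. As a minimal illustration of one standard approach, a fixed-effect meta-analysis weights each study's effect size by the inverse of its sampling variance, so more precise studies count for more. This is a sketch only; the study values below are hypothetical, not drawn from any of the resources listed here.

```python
import math

def fixed_effect_summary(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error."""
    weights = [1.0 / v for v in variances]          # precision of each study
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # SE of the pooled estimate
    return pooled, se

# Hypothetical effect sizes and sampling variances for three studies
effects = [0.30, 0.10, 0.25]
variances = [0.04, 0.01, 0.02]

pooled, se = fixed_effect_summary(effects, variances)
print(f"pooled effect = {pooled:.3f}, SE = {se:.3f}")
```

Note that this fixed-effect form assumes the studies estimate one common effect; when effects plausibly vary between studies, a random-effects model (which adds a between-study variance component to each weight) is the usual alternative.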