Search
11 results
Cases in outcome harvesting (Resource)
This report from The World Bank documents the pilot of a program examining the use of outcome harvesting alongside the Bank's results management approach to understand how change happens in complex environments.

Discussion note: Complexity-aware monitoring (Resource)
USAID's Office of Learning, Evaluation and Research (LER) has produced this discussion note for those seeking cutting-edge solutions to monitoring complex aspects of strategies and projects.

52 weeks of BetterEvaluation: Week 16: Identifying and documenting emergent outcomes of a global network (Blog)
Global voluntary networks are complex beasts with dynamic and unpredictable actions and interactions. How can we evaluate the results of such a network? Whose results are we even talking about?

Discussion paper: Innovations in monitoring and evaluation (Resource)
This discussion paper, produced by the United Nations Development Programme, discusses various innovations occurring in M&E and the advantages and disadvantages of these methods.

Retrospective 'outcome harvesting': Generating robust insights (Resource)
This paper describes the use of the Outcome Harvesting approach to evaluate a global voluntary network.

Outcome harvesting (Approach)
An impact evaluation approach suited to retrospectively identifying emergent impacts by collecting evidence of what has changed and then, working backwards, determining whether and how an intervention has contributed to these changes.

Week 11: BetterEvaluation at AfrEA 2014 (Blog)
BetterEvaluation was privileged to sponsor the Methodological Innovation stream at the African Evaluation Association (AfrEA) conference from 3-7 March. What did we learn?

Week 49: The 1st international conference on realist approaches to evaluation: my 'realist' take-aways (Blog)
In this blog, Tiina shares her top three realist 'take-aways' from the 1st International Conference on Realist Approaches to Evaluation and reflects on when and how realist evaluation may be most useful.

What would an evaluation conference look like if it was run by people who know and care about presenting information to support use? (hint: that should be us) (Blog)
All too often, conferences fail to make good use of the experience and knowledge of the people attending, with most time spent presenting prepared material that could be better delivered in other ways, and not enough time spent on discussions and…

Potent Presentations Initiative (p2i) guidelines for creating better handouts (Blog)
With a number of great conferences coming up fast on the horizon, we thought it an opportune time to share this article by Sheila B.

Clearing the fog: New tools for improving the credibility of impact claims (Resource)
This IIED Briefing Paper shows that the methods of process tracing and Bayesian updating can facilitate a dialogue between theory and evidence that allows the degree of confidence in 'contribution claims' to be assessed in a transparent…