The South African Monitoring and Evaluation Association (SAMEA) conference starts on Wednesday this week, with the theme ‘Improving use and results’. On Thursday, the programme includes a session called ‘Made in Africa: evaluation for development’, exploring values and diversity in development evaluation. To kick off discussion, we asked Benita Williams, an evaluator from Pretoria, South Africa, about how her values affect her evaluation work.
Collaborative Outcomes Reporting (COR) is an approach to impact evaluation that combines elements of several rigorous non-experimental methods and strategies. You’ll find it on the Approaches page of the BetterEvaluation site (an approach combines several options to address a number of evaluation tasks). This week we talk to Jess Dart, who developed COR. Jess is the new steward of BetterEvaluation’s COR page and, together with Megan Roberts from Clear Horizon, has provided a step-by-step guide, advice on choosing and using the approach well, and examples of its use.
This week BetterEvaluation is at the Australasian Evaluation Society conference in Brisbane, Australia, where the theme is "Evaluation shaping a better future: Priorities, pragmatics and power".
BetterEvaluation is improving and we would like you to be involved. We are redesigning the home page and the Start Here section; for the latter, we are seeking feedback on what to include.
In this week’s blog we interview Wouter Rijneveld, a consultant working on the measurement and utilisation of results, mainly in international development. He recently published a paper on the use of the Social Return on Investment (SROI) approach in Malawi, and we wanted to find out about his experience of using this less-reported approach. We were doubly interested when he told us that he was initially sceptical about SROI.
An evaluation usually involves some generalisation of the findings to other times, places or groups of people. If an intervention is found to be working well, we might generalise to say that it will continue to work well, that it will work well in another community, or that it will work well when expanded to wider populations. But how far can we generalise from one or more case studies? And how do we go about constructing a valid generalisation? In this blog, Rick Davies explores a number of different types of generalisation and some of the options for developing valid generalisations.
In the second part of our mini-series on monitoring and evaluating policy influence, Arnaldo Pellini, Research Fellow at the Overseas Development Institute, explores a project supporting research centres in Australia to monitor their impact on health policy in Southeast Asia and the Pacific. Arnaldo discusses the main challenges and makes some recommendations for others looking at the M&E of policy influence.
This two-part mini-series looks at the monitoring and evaluation of policy influence and advocacy. This first blog introduces a great new paper from Oxfam America exploring the topic from an NGO perspective; the second blog will present the perspective of a research programme.
In our third blog on mixed methods in evaluation, Tiina Pasanen from ODI focuses on impact evaluations (IEs) – a type of evaluation currently receiving a lot of attention in international development, with hundreds conducted every year. The clear majority of them are based on quantitative data and econometric analysis. There is much talk about the importance of combining methods to triangulate results and to better understand why something works, but in practice mixed methods IE designs are still rare, and they often fail to provide enough information for readers to follow and assess what has been done and why. As the number of mixed methods IEs is likely to grow in the next few years, should there be minimum standards for what constitutes a mixed methods design?
Willy Pradel and Gordon Prain from the International Potato Center in Lima, Peru, and Donald Cole from the University of Toronto discuss an evaluation they recently conducted, which applied a mixed methods approach to capture and understand a wide variety of changes to organic markets in the Central Andes region. This case demonstrates a good rationale for choosing a mixed methods design, as well as an authentic implementation that effectively mixes quantitative and qualitative data to enhance the value of each.