Rethinking impact evaluation for development - IDS special issue

This special issue of the IDS Bulletin presents contributions from the event 'Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future', organised by IDS in March 2013.

The articles examine different elements of impact evaluation and the challenges it faces in the increasingly complex work of development evaluation.

Extract

"...understanding and evaluating the impact of international development – and especially attributing effectiveness to specific interventions – is an increasingly challenging, but at the same time pressing, concern: tight budgets, greater demands for accountability, and a gradual cultural shift towards evidence-based policy have all served to reinvigorate a focus on measurement and evaluation. As a consequence, particular evaluation methods, such as experiments and quasi-experiments, have received special attention for: (a) their ability to produce findings that can be assessed according to clear quality standards, and (b) their ability to demonstrate causal links between the intervention and outcomes. The position of many donors (as demonstrated in a series of methodological guides, such as Gertler et al. 2011; HM Treasury 2011 and USAID 2011) has more or less explicitly identified a hierarchy of methods, ranked by their degree of ‘rigour’, where rigour is broadly intended as lack of ‘bias’. At the top of this hierarchy lie randomised controlled trials, followed respectively by quasi-experiments, mixed methods and qualitative methods. More or less explicitly, these rankings postulate that: (a) quantitative methods hold a superior status in comparison with qualitative methods; and (b) causal inference is exclusively the attribution of one effect to one cause, where the cause is the intervention and the effect is the ‘net’ or additional effect attributable to the intervention."

Contents

  • Introduction – Rethinking Impact Evaluation for Development. Barbara Befani, Chris Barnett and Elliot Stern (pages 1–5)
  • Have Development Evaluators Been Fighting the Last War… And If So, What is to be Done? Robert Picciotto (pages 6–16)
  • Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation. Barbara Befani and John Mayne (pages 17–36)
  • The Triviality of Measuring Ultimate Outcomes: Acknowledging the Span of Direct Influence. Giel Ton, Sietze Vellema and Lan Ge (pages 37–48)
  • Things You Wanted to Know about Bias in Evaluations but Never Dared to Think. Laura Camfield, Maren Duvendack and Richard Palmer-Jones (pages 49–64)
  • Making M&E More ‘Impact-oriented’: Illustrations from the UN. Jos Vaessen, Oscar Garcia and Juha I. Uitto (pages 65–76)
  • Some Thoughts on Development Evaluation Processes. Ole Winckler Andersen (pages 77–84)
  • Developing a Research Agenda for Impact Evaluation in Development. Patricia J. Rogers and Greet Peersman (pages 85–99)

Sources

Barbara Befani, Chris Barnett and Elliot Stern (eds) (2014). "Rethinking Impact Evaluation for Development", IDS Bulletin, Volume 45, Issue 6, Pages 1–99. Retrieved from: https://onlinelibrary.wiley.com/toc/17595436/2014/45/6
