Can we obtain the required rigour without randomisation? Oxfam GB’s non-experimental Global Performance Framework

This paper, written by Karl Hughes and Claire Hutchings for the International Initiative for Impact Evaluation (3ie), examines Oxfam GB's practice of randomly selecting projects each year for evaluation, including the extent to which each has promoted change in relation to a particular global outcome indicator. The paper also sets out four other options NGOs can use to better understand their impact.


"Non-governmental organisations (NGOs) operating in the international development sector need credible, reliable feedback on whether their interventions are making a meaningful difference but they struggle with how they can practically access it. Impact evaluation is research and, like all credible research, it takes time, resources, and expertise to do well, and – despite being under increasing pressure – most NGOs are not set up to rigorously evaluate the bulk of their work. Moreover, many in the sector continue to believe that capturing and tracking data on impact/outcome indicators from only the intervention group is sufficient to understand and demonstrate impact. A number of NGOs have even turned to global outcome indicator tracking as a way of responding to the effectiveness challenge. Unfortunately, this strategy is doomed from the start, given that there are typically a myriad of factors that affect outcome level change. Oxfam GB, however, is pursuing an alternative way of operationalising global indicators. Closing and sufficiently mature projects are being randomly selected each year among six indicator categories and then evaluated, including the extent each has promoted change in relation to a particular global outcome indicator. The approach taken differs depending on the nature of the project. Community-based interventions, for instance, are being evaluated by comparing data collected from both intervention and comparison populations, coupled with the application of statistical methods to control for observable differences between them. A qualitative causal inference method known as process tracing, on the other hand, is being used to assess the effectiveness of the organisation’s advocacy and popular mobilisation interventions. 
However, recognising that such an approach may not be feasible for all organisations, in addition to Oxfam GB’s desire to pursue complementary strategies, this paper also sets out several other realistic options available to NGOs to step up their game in understanding and demonstrating their impact. These include: 1) partnering with research institutions to rigorously evaluate “strategic” interventions; 2) pursuing more evidence informed programming; 3) using what evaluation resources they do have more effectively; and 4) making modest investments in additional impact evaluation capacity." (Hughes and Hutchings, 2011)
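The quasi-experimental logic described in the abstract — comparing data from intervention and comparison populations while statistically controlling for observable differences between them — can be illustrated with a minimal regression-adjustment sketch. The data are simulated and the variable names, effect sizes, and the choice of regression adjustment (rather than, say, propensity score matching) are illustrative assumptions, not Oxfam GB's actual method or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # households per group

# Comparison households start slightly wealthier on average:
# an observable difference between the two populations.
treated = np.r_[np.ones(n), np.zeros(n)]
baseline_wealth = np.r_[rng.normal(0.0, 1.0, n), rng.normal(0.5, 1.0, n)]

# Simulated data-generating process: the outcome depends on baseline
# wealth AND a genuine project effect of +0.30 for the intervention group.
outcome = 0.8 * baseline_wealth + 0.3 * treated + rng.normal(0.0, 1.0, 2 * n)

# A naive comparison of group means is biased by the wealth imbalance.
naive_effect = outcome[treated == 1].mean() - outcome[treated == 0].mean()

# Regression adjustment: including the observable covariate alongside the
# treatment indicator removes the bias it induces in the treatment estimate.
X = np.column_stack([np.ones(2 * n), treated, baseline_wealth])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)
adjusted_effect = coefs[1]

print(f"naive estimate:    {naive_effect:+.2f}")
print(f"adjusted estimate: {adjusted_effect:+.2f}  (true effect is +0.30)")
```

The sketch only removes bias from *observable* differences, which is exactly the caveat that motivates the "large n" discussion in the paper: unobserved differences between the groups remain uncontrolled.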


  • NGOs and the Effectiveness Challenge
  • Flirting with global outcome indicators
    • How to Demonstrate Effectiveness Ineffectively and at Great Cost
    • Setting Out on a Road Once Travelled
  • Working out a workable compromise: Oxfam GB’s global performance framework
    • But It Ain’t Just about Indicators!
    • Project Effectiveness Auditing: An Alternative Way of Operationalising Global Indicators
  • Horses for Courses: The Large and Small n Divide
    • Choosing the Right Causal Inference Tool for the Job
    • Mimicking Experiments Non-experimentally
    • Searching for Signatures and Smoking Guns
  • Harnessing Potential for Organisational Learning
  • Are There Other Options?
  • Concluding Thoughts
  • ANNEX 1: Oxfam GB’s global outcome indicators
  • ANNEX 2: Effectiveness Audit Pilot Summary – Tanzania Agricultural Scale-up
  • ANNEX 3: Effectiveness Audit Pilot Summary – Policy Influence in Africa


Hughes, K. and Hutchings, C. (2011). Can we obtain the required rigour without randomisation? Oxfam GB’s non-experimental Global Performance Framework. International Initiative for Impact Evaluation (3ie). Retrieved from: