This statement from the European Evaluation Society (EES) argues that randomised controlled trials (RCTs) are not necessarily the best way to ensure a rigorous or scientific impact evaluation (IE) for development and development aid.
The paper contends that multi-method approaches to IE are more effective than any single method.
"In contrast, the EES supports multi-method approaches to IE and does not consider any single method such as RCTs as first choice or as the 'gold standard':
- The literature clearly documents how all methods and approaches have strengths and limitations and that there are a wide range of scientific, evidence-based, rigorous approaches to evaluation that have been used in varying contexts for assessing impact.
- IE is complex, particularly of multi-dimensional interventions such as many forms of development (e.g. capacity building, global budget support, sectoral development) and consequently requires the use of a variety of different methods that can take into account rather than dismiss this inherent complexity.
- Evaluation standards and principles from across Europe and other parts of the world do not favour a specific approach or group of approaches - although they may require that the evaluator give reasons for selecting a particular evaluation design or combination.
RCTs represent one possible approach for establishing impact that may be suitable in some situations, e.g.:
- With simple interventions where a linear relationship can be established between the intervention and an expected outcome that can be clearly defined;
- Where it is possible and where it makes sense to 'control' for context and other intervening factors (e.g. where contexts are sufficiently comparable);
- When it can be anticipated that programmes under both experimental and control conditions can be expected to remain static (e.g. not attempt to make changes or improvements) often for a considerable period of time;
- Where it is possible and ethically appropriate to engage in randomisation and to ensure the integrity of the differences between the experimental and control conditions." (EES, 2007)
European Evaluation Society. (2007). The importance of a methodologically diverse approach to impact evaluation. Retrieved from: https://europeanevaluation.org/wp-content/uploads/2020/03/EES-Statement-on-methodological-diversity.pdf