Understand Causes

Most evaluations need to address questions about cause and effect – not only documenting what has changed but also understanding why.

Impact evaluation, which focuses on understanding the long-term results from interventions (projects, programs, policies, networks and organisations), always includes attention to understanding causes.  

Understanding causes can also be important in other types of evaluations. For example, in a process evaluation, there often needs to be some explanation of why implementation is good or bad in order to suggest ways it might be improved or sustained.

In recent years there has been considerable development of methods for understanding causes in evaluations, and also considerable discussion and disagreement about which options are suitable in which situations. 

When choosing between these different options, consider the different types of causal inference that might be involved: 

  • One cause producing one effect – it is necessary and sufficient to produce the effect 

  • Two or more causes combining to produce an effect (for example, two programs or a program when combined with other factors such as particular participant characteristics) – one of the causes alone is necessary but not sufficient 

  • Two or more causes being alternative ways of producing an effect – either of them is sufficient and neither is necessary

Different labels might be used for these different types of causal relationship: ‘causal attribution’ implying a single cause, ‘causal contribution’ implying a package of causal factors, and ‘causal inference’ referring to all of these.
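The three causal structures above can be made concrete with a small sketch in boolean logic. This is a hypothetical illustration (the function names and factors are invented for the example), modelling each structure as a rule for when the effect occurs:

```python
# Hypothetical sketch of the three causal structures, modelled as
# boolean functions over causal factors A and B.

def single_cause(a: bool) -> bool:
    """One cause, necessary and sufficient: the effect occurs iff A occurs."""
    return a

def causal_package(a: bool, b: bool) -> bool:
    """Two causes combining: A alone is necessary but not sufficient."""
    return a and b

def alternative_causes(a: bool, b: bool) -> bool:
    """Alternative ways of producing the effect: either cause is
    sufficient, and neither is necessary."""
    return a or b

# A is necessary for the causal package (no effect without it)...
assert causal_package(False, True) is False
# ...but not sufficient on its own.
assert causal_package(True, False) is False
# With alternative causes, A alone is sufficient...
assert alternative_causes(True, False) is True
# ...and not necessary, since B alone also produces the effect.
assert alternative_causes(False, True) is True
```

The distinction matters in practice: an evaluation that assumes a single necessary-and-sufficient cause will ask different questions than one that expects the intervention to work only as part of a causal package.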

It is also important to consider the different types of questions that might be asked about cause and effect: 

  • Did the intervention make a difference? 

  • For whom, in what situations, and in what ways did the intervention make a difference? 

  • How much of a difference did the intervention make? 

  • To what extent can a specific impact be attributed to the intervention? 

  • How did the intervention make a difference? 

To explore the different ways of understanding causes in an evaluation, download the overview, which lists different methods, designs, processes and approaches. You can also explore the following three broad strategies for causal inference.


1. Check the results are consistent with causal contribution

This strategy should be part of all evaluations that include causal questions.  There are a number of options and approaches that can be used to check that the data are consistent with what would be expected if the intervention were contributing to producing the observed changes. 

2. Compare the results to the counterfactual

This strategy is appropriate in some but not all evaluations.  There are a number of options and approaches that can be used to develop a counterfactual - an estimate of what would have happened without the intervention - and to compare that to the findings of what happened with the intervention. 
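One common way to develop a counterfactual is to use a comparison group, and to take the difference between observed outcomes and the comparison group's outcomes as the estimated effect. The following is a minimal sketch of that logic; all of the outcome figures are invented for the example, and a real evaluation would also need to consider how comparable the two groups actually are:

```python
# Hypothetical illustration: comparing observed outcomes for participants
# with a counterfactual estimated from a comparison group.
# All figures are invented for the example.

treatment_group = [72, 68, 75, 80, 71]   # outcome scores with the intervention
comparison_group = [65, 60, 70, 66, 64]  # estimate of outcomes without it

def mean(xs):
    return sum(xs) / len(xs)

# The estimated effect is the difference between what happened and the
# counterfactual estimate of what would have happened.
estimated_effect = mean(treatment_group) - mean(comparison_group)
print(round(estimated_effect, 1))  # prints 8.2
```

The credibility of this kind of estimate rests entirely on how well the comparison group stands in for "what would have happened" - which is why this strategy is appropriate in some evaluations but not others.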

3. Investigate possible alternative explanations

This strategy should be part of all evaluations that include causal questions.  There are a number of options and approaches that can be used to identify other factors that might have caused the impacts and to see if it is possible to rule them out. 


Recorded webinar: Jane Davidson's 20-minute overview of options for causal inference, from the American Evaluation Association's Coffee Break series. Free to all, including non-members.

Models of causality and causal inference.  Paper by Barbara Befani discussing different ways of thinking about causality and investigating cause and effect. 

Making causal claims.  Paper by John Mayne on the logic involved in thinking about multiple contributing factors to produce results. 


Cassie Bell

Hi. I work for a public sector union in Canada that is halfway through implementing a five-year education plan. While formative evaluation has been undertaken by staff educators and their supervisors, the summative evaluation is only being developed now. What would be the most effective evaluation to use to determine impact and/or outcomes of the program?


Patricia Rogers

Hi Cassie,

The most effective evaluation design or approach will depend on the nature of the intervention, the purposes of the evaluation and the availability of resources, especially time, money and existing data. I suggest you check out our page on impact evaluation, which provides guidance for working through the various issues you need to consider and the options for how you might go about it. https://www.betterevaluation.org/en/themes/impact_evaluation

Steve Montague

Hi Cassie / Patricia,

Sorry to have missed this in June. I would also point out that Patricia and her colleague Sue Funnell have written on the information/education archetype in terms of a theory of change (e.g. Purposeful Program Theory, pp. 352-357). At Carleton University here in Canada, where I teach, we draw heavily on betterevaluation.org and on Purposeful Program Theory as core resources, and we have had many student projects looking at the cause-effect (impact) of information/educational programs. I recommend reviewing past work in this area.

Perhaps because we are Canadian, we tend towards contribution analysis - a method developed by John Mayne, based here in Ottawa, oriented to the careful application of program theory and the idea of 'causal packages', and more recently very attuned to realist approaches applied at key links in the theory (causal pathway). As an example, see https://pubmed.ncbi.nlm.nih.gov/28557934/. 'We' at Carleton had nothing to do with this one (but it is very much in line with our typical approach - see for example Figure 2), so I feel I can recommend it without hesitation. (And let us know if you might be interested in a student practicum project down the road. Our teams have launched as of September for this year, but we will be looking again for next September.)

Hope this helps! Best of luck with it.