An iterative evaluation design begins with an initial high-level design or process, from which a more detailed design is developed iteratively as the evaluation progresses, in response to emerging findings and information needs.
This is an alternative to upfront evaluation design, where the design is completed before or near the beginning of the evaluation and then implemented as designed, or as revised at the end of the inception period.
An iterative evaluation design can be used to evaluate a single project or program. This might involve starting with an analysis of existing data and then undertaking strategic collection and analysis of additional data to fill in gaps and test alternative explanations. This process is used in contribution analysis, process tracing, and outcome harvesting.
An iterative evaluation design is needed when an evaluation is intended to support ongoing adaptation and responsiveness to emerging issues or understandings. This is the process used in developmental evaluation.
An iterative evaluation design can also be used as part of a long-term evaluation intended to support learning about emerging policy issues and changing priorities.
Examples
Evaluation of the Stronger Families and Communities Strategy, Australia 2000-2004
This evaluation used an iterative design to evaluate $80 million in funding to 635 projects under 7 initiatives, part of a 3-year national strategy to strengthen families and communities. The evaluation framework was structured in four levels:
- Level 1 data: Data collected from all projects, through progress and final reporting against performance indicators, through separate reports, and through Initial and Final Questionnaires administered for the evaluation.
- Level 2 papers: Issue-focused papers that linked research evidence, policy frameworks and data from a cluster of projects, largely involving analysis of available information with illustrations from Strategy projects.
- Level 3 studies: Case studies of specific projects, communities, initiatives or issues, involving the collection of additional data as well as analysis of available data.
- Level 4 synthesis: An evaluation of the Strategy overall, synthesising findings from the other levels.
While the key evaluation questions remained largely constant, the specific details of the design were developed iteratively as the evaluation progressed. Early level 3 case studies of successful projects informed the development of level 2 issues papers and the level 1 analysis of data from all projects. Issues identified in the analysis of level 1 data from all projects in turn informed the development of level 2 issues papers and of level 3 case studies involving additional data collection. A new policy focus on economic and social participation that emerged during the evaluation was chosen as the topic for a level 2 issues paper.
The topic of sustainability was identified early on as an important issue. It was addressed through an early level 2 issues paper that drew on research and practice literature, and on examples from Strategy projects, to clarify different types of sustainability, including ongoing activities and enduring outcomes after funding ended. This paper then informed the design of a level 3 case study on sustainability, which collected and analysed data from a random sample of 113 completed projects.
Advice for choosing this method
Iterative evaluation design is likely to be appropriate when:
- Upfront evaluation design is not possible until analysis of existing or early data has clarified evidence gaps and alternative explanations that need to be investigated.
- New or revised evaluation priorities and questions are likely to emerge during the course of the evaluation.
- Those conducting the evaluation and those managing it can accommodate changes in plans, both technically and contractually.
Advice for using this method
Effective use of iterative evaluation design can be supported by:
- Nimble, clear and documented decision-making processes for selecting priorities for subsequent investigation, endorsing changes and elaborations to the evaluation design, and agreeing any revisions to contractual obligations and deliverables.
- An evaluation team with a range of methodological expertise that can develop and implement different designs.
- Resources that can be deployed flexibly. This might take the form of a contingency allocation in the budget, an allocation of staff time that can be deployed in different ways, or a staged approach in which the design for each phase is developed based on what has been learned and what is most important to learn next.
Sources
CIRCLE. (2005). Evaluation of the Stronger Families and Communities Strategy 2000-2004.
McKegg, K. & Wehipeihana, N. (2016). Developmental Evaluation in Synthesis: Practitioners' Perspectives. In Patton, M. Q., McKegg, K. & Wehipeihana, N. (Eds.), Developmental Evaluation Exemplars: Principles in Practice. New York: Guilford Press. p. 287.
Patton, M. Q. & Britt, H. (2012). Budgeting for Developmental Evaluation (DE). https://www.betterevaluation.org/tools-resources/budgeting-for-developmental-evaluation-de