Are you thinking about using Generative AI (Gen AI) to analyze your qualitative data? A principle-led analysis plan may help you navigate this process and make decisions that are both ethically sound and practical for your analysis task.
Heather Britt
Heather Britt is an independent evaluator specializing in uncertain, emergent, contested, and dynamic programming. She authored USAID’s 2013 Complexity-Aware Monitoring Discussion Note, and co-authored briefs on Outcome Harvesting and Causal Link Monitoring. As AEA SETIG co-chair, she led a collaborative process to define systems-informed evaluation principles. She chairs the American Evaluation Association’s International Working Group.
Content this member has contributed or contributed to
Blog
- Development actors are embracing the concept and practice of adaptive management, using evidence to inform ongoing revisions throughout implementation.
Resource
- This resource provides answers to a selection of frequently asked questions about the Causal Link Monitoring (CLM) approach.
- This series of webinars was first presented at the Causal Pathways Symposium 2023, which focused on "connecting, learning, and building a shared understanding of the evaluation and participatory practices that make causal pathways more visible".
- This session of the Causal Pathways Symposium 2023, by Heather Britt, introduced causal link monitoring, a method for integrating monitoring data and evaluation in order to address causality amid complexity.
- Causal Link Monitoring (CLM) integrates design and monitoring to support adaptive management of projects.
- This 27-page brief, written by Ricardo Wilson-Grau and Heather Britt, introduces the key concepts and approach used by Outcome Harvesting (published by the Ford Foundation in May 2012; revised in Nov 2013).
- An interview with internationally recognised evaluation expert Michael Quinn Patton by Heather Britt for BetterEvaluation, April 2012.
- This is a straightforward budget example that lists costs associated with four basic expenditure categories: staffing, materials and supplies, equipment, and travel.
- Page 17 of this guide from Imagine Canada provides an example of an evaluation budget for a one-year project evaluation.
- This document is a USAID guide to developing an evaluation budget based on the resources identified in the evaluation Statement of Work (SOW).
Method
- The resources available for evaluation include people's time and expertise, equipment, and funding.
- Reducing costs is something to consider if evaluation costs outweigh the predicted benefits or available resources.
- This strategy for securing sufficient resources for conducting evaluation involves allocating a specified amount of staff time (hours or days per week) to work on evaluation.
- This strategy requires management leadership and uses the rule-of-thumb approach to estimate the percentage of project funds to spend on evaluation.
- An evaluation budget matrix specifies various items that need to be costed as individual line items.
- Evaluation expenses are highly situational and there are no magic formulas for calculating costs.
- You may also consider approaching a foundation or other donor agency for the funds to undertake an evaluation.
- As many projects are undertaken by a consortium of organisations working together, sometimes it is worthwhile to consider approaching your implementing partners to pool resources and carry out the evaluation jointly.
Approach
- Causal link monitoring (CLM) is an approach to designing and implementing monitoring, evaluation and learning (MEL) systems that prioritise information for managing adaptively in complexity.
- Outcome Harvesting collects (“harvests”) evidence of what has changed (“outcomes”) and, working backwards, determines whether and how an intervention has contributed to these changes.