Search: 14 results
Communicating evaluation findings (Blog)

Adapting evaluation in the time of COVID-19 – Part 1: Manage (Blog)
Organisations around the world are quickly having to adapt their programme and project activities to respond to the COVID-19 pandemic and its consequences. We're starting a new blog series to help support these efforts.

Adapting evaluation in the time of COVID-19 – Part 3: Frame (Blog)
Evaluation needs to respond to the changes brought about by the COVID-19 pandemic. As well as direct implications for the logistics of collecting data and managing evaluation processes, the pandemic has led to rapid changes…

Adapting evaluation in the time of COVID-19 – Part 4: Describe (Blog)
We're continuing our series, sharing ideas and resources on ways of ensuring that evaluation adequately responds to the new challenges during the pandemic.

52 weeks of BetterEvaluation: Week 47: using video to communicate evaluation findings (Blog)
In the last in our series of blogs on using video in evaluation, Glenn O'Neil joins us to discuss how you can use video to communicate your evaluation findings.

Week 15: Fitting reporting methods to evaluation findings – and audiences (Blog)
This week we're sharing some ideas from Rakesh Mohan on ways of making evaluation reports more interesting.

7 Strategies to improve evaluation use and influence – Part 1 (Blog)
What can be done to support the use of evaluation? How can evaluators, evaluation managers and others involved in or affected by evaluations support the constructive use of findings and evaluation processes?

Evaluation report checklist (Resource)
This checklist was developed by drawing upon and reflecting on The Program Evaluation Standards, which were created by the Joint Committee on Standards for Educational Evaluation, 1994.

Making change happen: Advocacy and citizen participation (Resource)
This paper provides an overview of issues related to advocacy and citizen participation, and may serve as a starting point for evaluation of the advocacy efforts of an organisation.

Regression discontinuity (Method)
Regression Discontinuity Design (RDD) is a quasi-experimental evaluation option that measures the impact of an intervention, or treatment, by applying a treatment assignment mechanism based on a continuous eligibility index, which is a variable…

Quasi-experimental methods for impact evaluations (Resource)
This video lecture, given by Dr Jyotsna Puri for the Asian Development Bank (ADB) and the International Initiative for Impact Evaluation (3ie), demonstrates how the use of quasi-experimental methods can circumvent the challenge of creating…

Adapting evaluation in the time of COVID-19 – Part 2: Define (Blog)
The COVID-19 pandemic has led to rapid changes in the activities and goals of many organisations, whether these relate to addressing direct health impacts, the consequential economic and social impacts, or to the need to change the way things…

Quasi-experimental design and methods (Resource)
This guide, written by Howard White and Shagun Sabarwal for UNICEF, looks at the use of quasi-experimental design and methods in impact evaluation.

UNICEF webinar: Quasi-experimental design and methods (Resource)
What is the main difference between quasi-experiments and RCTs? How can I measure impact when establishing a control group is not an option?