Find resources by method or process in the Rainbow Framework
The BetterEvaluation Rainbow Framework sorts more than 300 evaluation methods and processes into 7 clusters of tasks: Manage, Define, Frame, Describe, Understand Causes, Synthesise, and Report and Support Use. Each method and process has a number of curated resources linked to it. You can navigate through the Rainbow Framework to learn more about the different evaluation tasks, methods and processes, and find resources this way.
Or, if you are simply looking for resources related to a specific method or process, you can use this form.
This study aims to enhance the role of rural women in facing the negative impacts of climate change by drawing on their local knowledge and accumulated experience in the rainy mountain and highland areas of Yemen.
WISE's website organises a large number of statistics resources available on the web into one central place. It is also home to a series of interactive, sequenced tutorials on key statistical concepts. The tutorials use dynamic applets that allow users to explore relationships on their own. Guided exercises are designed to help the learner take full advantage of the applets and gain a deeper understanding of the concepts and logic that underlie much of inferential statistics.
This blog post, written by Jed Friedman for the World Bank, describes a process of using analytic methods to overcome some of the assumptions that must be made when extrapolating results from evaluations to other settings. A number of detailed replies further develop the ideas initially presented.
This is a 20-minute presentation on evaluation use developed by Alexey Kuzmin (Process Consulting Company, Moscow, Russia) in collaboration with the ILO International Training Centre (ILO ITC, Turin, Italy).
This detailed guide provides investigators with a rigorous technical discussion of Cost Effectiveness Analysis (CEA) procedures, written from a public health perspective, as an option for assessing the efficiency of an intervention. The guidance takes particular care to demonstrate how to maximise the generalisability of results across settings.
This handbook from the World Health Organisation (WHO) provides step-by-step guidance on conducting evaluations in WHO. The book is divided into two parts: the first outlines the definition, objectives, principles and management of evaluation in WHO; the second provides detailed practical knowledge for conducting an evaluation that complies with WHO's evaluation policy. Also included are a number of templates that can be used during the different phases of the evaluation process.
This workshop by Jeremy Holland for the Institute of Development Studies was streamed live on May 1st, 2014. It discusses the importance of participatory statistics, arguing that local people can generate their own numbers, and that the resulting statistics are both empowering for the communities themselves and capable of influencing policy.
This resource is a detailed overview of a study by Lewis and Pattanayak (2012), a systematic review of the literature on the adoption of Improved Cook Stoves (ICS) or cleaner fuels by households in developing countries.
This article introduces papers from a 2016 forum, "Where Impact Measurement Meets Evaluation", co-sponsored by the American Evaluation Association and Social Value International, which discussed the intersection between evaluation and impact measurement.