Find resources by method or process in the Rainbow Framework
The BetterEvaluation Rainbow Framework sorts more than 300 evaluation methods and processes into 7 clusters of tasks: Manage, Define, Frame, Describe, Understand Causes, Synthesise, and Report and Support Use. Each method and process has a number of curated resources linked to it. You can navigate through the Rainbow Framework to learn more about the different evaluation tasks, methods and processes, and find resources this way.
Or, if you are simply looking for resources to do with a specific method or process, you can use this form.
This paper, written by Marcel P. J. M. Dijkers for the Task Force on Systematic Review and Guidelines, examines the approach in evidence-based practice of using only the "best available" evidence and argues that this strongly disadvantages rehabilitation programs because of their nature.
Many development programme staff have had the experience of commissioning an impact evaluation towards the end of a project or programme only to find that the monitoring system did not provide adequate data about implementation, context, baselines or interim results. This Methods Lab guidance note by Greet Peersman, Patricia Rogers, Irene Guijt, Simon Hearn, Tiina Pasanen and Anne L. Buffardi has been developed in response to this common problem.
This research paper uses instrumental variables to analyse whether "participation in out-of-school extracurricular activities improve academic achievement or behavior for elementary school children? [And] If so, are the impacts of participation related to the types of extracurricular activities that students pursue (e.g., music and arts, language, computer classes, sports)?" (Chaplin & Puma, 2003)
This report, conducted by the American Institutes for Research (AIR), WestEd and the Justice Resource Institute (JRI), reviews the evidence behind programs that have been designed to reduce serious violence among groups of young offenders. The report is a useful example of the use of Rapid Evidence Assessment (REA), using this approach to identify effective violence prevention strategies. The limitations of REA, both as a methodology and in the context of this report, are discussed on page 33.
This paper from Charities Evaluation Services provides an overview of the process and content needed to create an evaluation brief.
This guide consists of two PowerPoint slides in a World Bank training module which list ethical issues that an external evaluator ought to address.
This toolkit focuses on the issue of data integration within mixed methods research. The term ‘mixed methods’ is used here to denote research that combines qualitative and quantitative data collection and analysis in one study. One of the main issues facing many mixed methods researchers is the question of how to integrate data, with the particular problem of ‘contradictory’ data.
This comment, written by Dean Ornish and published on the Edge.org blog What scientific idea is ready for retirement, argues that larger studies do not always equate to more rigorous or definitive results and that randomized controlled trials (RCTs) may in fact introduce their own biases. The post goes on to argue that, due to the issues that RCTs raise, new and more thoughtful experimental designs and systems approaches need to be developed.
This 3ie working paper examines the extent to which impact evaluation methods can provide evidence to help improve the effectiveness and efficiency of humanitarian action.
This article by Dustin Welbourne and Will J Grant in The Conversation discusses ways to make a science video popular and effective at communicating, highlighting a number of key features that are demonstrated through embedded examples.