Search results

  1. Summer School Programme: Result-based M&E and Outcome and Impact Evaluation

    Event
    Course
    7th September, 2015 to 18th September, 2015
    Italy
    Paid

    As part of the 10th Annual Edition of the joint Bologna Centre for International Development / Department of Economics Summer School Programme on Monitoring and Evaluation, the programme's focus for the September 2015 modules is on Result-based Monitoring and Evaluation (first module) and Outcome and Impact Evaluation (second module). 

  2. UNICEF Webinar: Quasi-experimental design and methods

    Resource
    2015

    What is the main difference between quasi-experiments and RCTs? How can I measure impact when establishing a control group is not an option?

    In the second-to-last webinar of the series, Dr. Howard White of the International Initiative for Impact Evaluation (3ie) covers the basics of quasi-experiments.

  3. I'm doing an impact evaluation: What evidence do I need? (#AES17 presentation slides)

    Resource
    Overview
    2017

    Are quantitative or qualitative methods better for undertaking impact evaluations? What about true experiments? Is contribution analysis the new 'state of the art' in impact evaluation or should I just do a survey and use statistical methods to create comparison groups?

    Determining one's plan for an impact evaluation occurs within the constraints of a specific context. Since method choices must always be context-specific, debates in the professional literature about impact methods can at best provide only partial guidance to evaluation practitioners. The way to break out of this methods impasse is by focusing on the evidentiary requirements for assessing causal impacts.

  4. Sustained and Emerging Impacts Evaluation (SEIE)

  5. UNICEF Webinar 6: Comparative Case Studies

    Resource
    2015

    What does a non-experimental evaluation look like? How can we evaluate interventions implemented across multiple contexts, where constructing a control group is not feasible?

    Webinar 6 on comparative case studies was presented by Dr. Delwyn Goodrick, with a Q&A session between the presenter and audience at the end. It took place on Thursday, 27th of August, with a repeat session on Monday, 31st of August.

  6. Broadening the range of designs and methods for impact evaluations

    Resource
    Overview
    2012

    The working paper, written by Elliot Stern, Nicoletta Stame, John Mayne, Kim Forss, Rick Davies and Barbara Befani for the UK Department for International Development (DFID), describes how theory-based, case-based and participatory options can be used in impact evaluations. These designs show promise to reinforce existing impact evaluation practice, including experimental and statistical designs, when dealing with complex programmes.

  7. Impact evaluation

    Development Theme

    An impact evaluation provides information about the impacts produced by an intervention - positive and negative, intended and unintended, direct and indirect. This means that an impact evaluation must establish the cause of observed changes (in this case 'impacts'), a process referred to as causal attribution (also known as causal inference).

  8. Impact Evaluation: A Guide for Commissioners and Managers

    Resource
    Guide
    2015

    This guide, written by Elliot Stern, builds on an initial report prepared for the UK Department for International Development (DFID), 'Broadening the range of designs and methods for impact evaluations'. The impetus for this guide came from a 'cross-funders group' interested in helping decision-makers within civil society organisations, and those that fund them, to better understand how to commission, manage and use impact evaluations.

  9. Developing a research agenda for impact evaluation

    Blog
    13th February, 2015

    Impact evaluation, like many areas of evaluation, is under-researched. Doing systematic research about evaluation takes considerable resources and is often constrained by the availability of information about evaluation practice. Much of the work undertaken in evaluation is not readily visible (see the recent comments by Drew Cameron on an earlier blog post, which provide details about the considerable effort involved in a study of impact evaluations in development).

  10. Choosing appropriate designs and methods for impact evaluation - Department of Industry, Innovation and Science

    Resource
    Guide
    2016

    The Department of Industry, Innovation and Science has commissioned this report to explore the challenges and document a range of possible approaches for the impact evaluations that the department conducts. Research for the project comprised interviews with key internal stakeholders to understand their needs, and a review of the literature on impact evaluation, especially in the industry, innovation and science context. That research led directly to the development of this guide. This research project is the first stage of a larger project to develop materials as the basis for building departmental capability in impact evaluation.
