Search results

  1. Impact evaluation

    Development Theme

    An impact evaluation provides information about the impacts produced by an intervention: positive and negative, intended and unintended, direct and indirect. This means that an impact evaluation must establish what has caused the observed changes (in this case, 'impacts'), a process referred to as causal attribution (also known as causal inference).

  2. Sustained and Emerging Impacts Evaluation (SEIE)

  3. Summer School Programme: Result-based M&E and Outcome and Impact Evaluation

    7th September, 2015 to 18th September, 2015

    As part of the 10th Annual Edition of the joint Bologna Centre for International Development / Department of Economics Summer School Programme on Monitoring and Evaluation, the programme's focus for the September 2015 modules is on Result-based Monitoring and Evaluation (first module) and Outcome and Impact Evaluation (second module). 

  4. What do we mean by ‘impact’?

    17th March, 2016

    International development is fixated with impact. But how do we know we’re all talking about the same thing?

  5. UNICEF Webinar 6: Comparative Case Studies


    What does a non-experimental evaluation look like? How can we evaluate interventions implemented across multiple contexts, where constructing a control group is not feasible?

    Webinar 6 on comparative case studies was presented by Dr. Delwyn Goodrick, with a Q&A session between the presenter and audience at the end. It took place on Thursday, 27th of August, with a repeat session on Monday, 31st of August.

  6. SRA: Advanced evaluation: New thinking and choices in impact evaluation

    7th June, 2015
    United Kingdom

    For government and its agencies, the European Commission, the Lottery, and charitable Trusts, evaluation of impact has become a cornerstone in understanding the accountability and effectiveness of programmes and initiatives. In an environment where resources for such activity are often scarce, those tasked with designing and managing evaluations find themselves confronted with confusing choices about 'the right' approaches and techniques. This course demystifies impact evaluation and helps those commissioning and conducting evaluations make effective choices.

  7. I'm doing an impact evaluation: What evidence do I need? (#AES17 presentation slides)


    Are quantitative or qualitative methods better for undertaking impact evaluations? What about true experiments? Is contribution analysis the new 'state of the art' in impact evaluation or should I just do a survey and use statistical methods to create comparison groups?

    Determining one's plan for an impact evaluation occurs within the constraints of a specific context. Since method choices must always be context specific, debates in the professional literature about impact methods can at best provide only partial guidance to evaluation practitioners. The way to break out of this methods impasse is by focusing on the evidentiary requirements for assessing causal impacts.

  8. Choosing appropriate designs and methods for impact evaluation – Department of Industry, Innovation and Science


    The Department of Industry, Innovation and Science has commissioned this report to explore the challenges and document a range of possible approaches for the impact evaluations that the department conducts. Research for the project comprised interviews with key internal stakeholders to understand their needs, and a review of the literature on impact evaluation, especially in the industry, innovation and science context. That research led directly to the development of this guide. This research project is the first stage of a larger project to develop materials as the basis for building departmental capability in impact evaluation.

  9. How to Manage, Design, and Conduct Impact Evaluations – CLEAR-SHIPDET Center

    1st August, 2016 to 12th August, 2016

    This technical course introduces impact evaluation as a key instrument for determining project/program effectiveness, informing policy development and improving program designs. 

  10. Contribution Tracing

    11th July, 2016 to 13th July, 2016
    United Kingdom

    Impact evaluation commissioners place increasing emphasis on assessing the contribution made by projects and programmes to changing people's lives, commonly referred to as a 'contribution claim'. It can be argued that current theory-based approaches fail to provide evaluators with guidance on the 'right' data to gather, or on the quality of that data, in relation to a particular contribution claim. This course aims to guide evaluators in collecting data that can help assess how strongly or weakly the evidence supports a contribution claim.