Search results

  1. UNICEF Webinar: Theory of Change

    Resource
    2015

    What is a Theory of Change? How is it different from a logframe? Why is it such an important part of an impact evaluation?

    The third impact evaluation webinar in this series focused on Theory of Change and took place on Wednesday 15th of April and Thursday 16th of April (repeat session). This webinar series is organized by the Office of Research – Innocenti and presented by evaluation experts from RMIT University, BetterEvaluation and 3ie throughout 2015.

  2. Impact evaluation

    Development Theme

    An impact evaluation provides information about the impacts produced by an intervention: positive and negative, intended and unintended, direct and indirect. This means that an impact evaluation must establish the cause of the observed changes (in this case, ‘impacts’), a process referred to as causal attribution (also referred to as causal inference).

  3. UNICEF Webinar: Overview of Impact Evaluation

    Resource
    Overview
    2015

     We often talk about the importance of knowing the impact of our work, but how is impact measured in practice? What are the ten basic things about impact evaluation that a UNICEF officer should know?

  4. Impact evaluation: challenges to address

    Blog
    23rd January, 2015

    In 2015, we’re presenting "12 months of BetterEvaluation", with blog posts focusing each month on a different issue. This is the first in a series on impact evaluation, our focus for January.

    In development, government and philanthropy, there is increasing recognition of the potential value of impact evaluation. There is dedicated funding available and specific initiatives to develop capacity for both commissioning and conducting impact evaluation, including supporting use of the findings.     

  5. UNICEF Webinar: Overview of Data Collection and Analysis Methods in Impact Evaluation

    Resource
    Overview
    2015

    What is the value of using mixed methods in impact evaluation? What methods and designs are appropriate for answering descriptive, causal and evaluative questions?

  6. UNICEF Webinar: Randomized Controlled Trials

    Resource
    2015

    What are the key features of an RCT? Are RCTs really the gold standard? What ethical and practical issues do I need to consider before deciding to do an RCT?

    The fifth webinar in this series is presented by international evaluation expert and former Executive Director of 3ie, Dr Howard White.

  7. Overview: Strategies for Causal Attribution

    Resource
    Guide
    2014

    This guide, written by Patricia Rogers for UNICEF, looks at the process of causal attribution with a particular emphasis on its use in impact evaluation.  The guide specifically focuses on the three broad strategies for causal attribution: estimating the counterfactual; checking the consistency of evidence for the causal relationships made explicit in the theory of change; and ruling out alternative explanations, through a logical, evidence-based process. 

    Also Available In: Français, Español

  8. I'm doing an impact evaluation: What evidence do I need? (#AES17 presentation slides)

    Resource
    Overview
    2017

    Are quantitative or qualitative methods better for undertaking impact evaluations? What about true experiments? Is contribution analysis the new 'state of the art' in impact evaluation or should I just do a survey and use statistical methods to create comparison groups?

    Determining one's plan for an impact evaluation occurs within the constraints of a specific context. Since method choices must always be context-specific, debates in the professional literature about impact methods can at best provide only partial guidance to evaluation practitioners. The way to break out of this methods impasse is by focusing on the evidentiary requirements for assessing causal impacts.

  9. Developing a research agenda for impact evaluation

    Blog
    13th February, 2015

    Impact evaluation, like many areas of evaluation, is under-researched. Doing systematic research about evaluation takes considerable resources and is often constrained by the availability of information about evaluation practice. Much of the work undertaken in evaluation is not readily visible (see the recent comments by Drew Cameron on an earlier blog post, which provide details about the considerable effort involved in a study of impact evaluations in development).

  10. UNICEF Webinar: Quasi-experimental design and methods

    Resource
    2015

    What is the main difference between quasi-experiments and RCTs? How can I measure impact when establishing a control group is not an option?

    In the second-to-last webinar of the series, Dr Howard White of the International Initiative for Impact Evaluation (3ie) covers the basics of quasi-experiments.
