Search

154 results

  • Coolors

    Coolors is a colour palette generator.
    Resource
  • Scaling impact: Innovation for the public good

This book, written by Robert McLean (IDRC) and John Gargani (Gargani + Company), presents actionable principles that can help organizations and innovators design, manage, and evaluate scaling strategies.
    Resource
  • The advocacy iceberg - episode 1: the value iceberg

The pilot episode of this new podcast by Jim Coe features an interview with Rhonda Schlangen, co-author with Jim of The Value Iceberg, a BetterEvaluation Discussion Paper about how the most important elements of advocacy are often hidden from view.
    Resource
  • Excel for evaluation

    This website, created by Ann Emery, provides a series of short videos on using Microsoft Excel to analyze data.
    Resource
  • Paletton

Paletton (formerly Color Scheme Designer) is an application that enables you to design a colour scheme for your documents and presentations and then check it against an accessibility tool to ensure your colour scheme is accessible to those with colour vision deficiencies.
    Resource
  • Announcing the IDRC program managers' guide to evaluation and the GeneraTOR

    We’re excited to announce the launch of the BetterEvaluation and IDRC (International Development Research Centre) Program Managers’ Guide to Evaluation and GeneraTOR.
    Blog
  • Handbook on monitoring, evaluating and managing knowledge for policy influence

This handbook from the Center for the Implementation of Public Policies Promoting Equity and Growth (CIPPEC) is designed to support research institutions in developing their capacity to monitor, evaluate and manage knowledge for policy influence.
    Resource
  • IDRC strategic evaluation of capacity development: Doing things better?

This paper from the International Development Research Centre (IDRC) analyses whether the results of capacity-development projects have supported the organisation in achieving its mission.
    Resource
  • Regression discontinuity

Regression Discontinuity Design (RDD) is a quasi-experimental evaluation option that measures the impact of an intervention, or treatment, by applying a treatment assignment mechanism based on a continuous eligibility index, which is a variable with a clearly defined cut-off that determines who receives the treatment.
    Method
  • Tools for knowledge and learning: A guide for development and humanitarian organisations

    This tool kit presents entry points and references to the wide range of tools and methods that have been used to facilitate improved knowledge and learning in the development and humanitarian sectors.
    Resource
  • Knowledge management and organizational learning

This article provides an overview of knowledge management and its role in organisational learning.
    Resource
  • Quasi-experimental methods for impact evaluations

This video lecture, given by Dr Jyotsna Puri for the Asian Development Bank (ADB) and the International Initiative for Impact Evaluation (3ie), demonstrates how quasi-experimental methods can circumvent the challenge of creating a credible counterfactual when randomisation is not possible.
    Resource
  • Quasi-experimental design and methods

This guide, written by Howard White and Shagun Sabarwal for UNICEF, looks at the use of quasi-experimental design and methods in impact evaluation.
    Resource
  • UNICEF webinar: Quasi-experimental design and methods

    What is the main difference between quasi-experiments and RCTs? How can I measure impact when establishing a control group is not an option?
    Resource
  • Multiple lines and levels of evidence

Multiple lines and levels of evidence (MLLE) is a systematic approach to causal inference that involves bringing together different types of evidence (lines of evidence) and considering the strength of the evidence.
    Method
  • Journals and logs

    Journals and logs are forms of record-keeping tools that can be used to capture information about activities, results, conditions, or personal perspectives on how change occurred over a period of time.
    Method
  • Integrity

    Integrity refers to ensuring honesty, transparency, and adherence to ethical behaviour by all those involved in the evaluation process.
    Method
  • Cultural competency

    Cultural competency involves ensuring that evaluators have the skills, knowledge, and experience necessary to work respectfully and safely in cultural contexts different from their own.
    Method
  • Feasibility

Feasibility refers to ensuring that an evaluation can be realistically and effectively implemented, considering factors such as practicality, resource use, and responsiveness to the programme's context and culture.
    Method
  • Inclusion of diverse perspectives

    Inclusion of diverse perspectives requires attention to ensure that marginalised people and communities are adequately engaged in the evaluation.
    Method
  • Independence

Independence can include organisational independence, where an evaluator or evaluation team can independently set a work plan and finalise reports without undue interference, and behavioural independence, where evaluators can conduct and report on evaluations without undue influence.
    Method
  • Evaluation accountability

    Evaluation accountability relates to processes in place to ensure the evaluation is carried out transparently and to a high-quality standard.
    Method
  • Transferability

    Transferability involves presenting findings in a way that they can be applied in other contexts or settings, considering the local culture and context to enhance the utility and reach of evaluation insights.
    Method
  • Utility

    Utility standards are intended to increase the extent to which program stakeholders find evaluation processes and products valuable in meeting their needs.
    Method
  • Professionalism

    Professionalism within evaluation is largely understood in terms of high levels of competence and ethical practice.
    Method
  • Propriety

    Propriety refers to ensuring that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in it and those affected by its results.
    Method
  • Systematic inquiry

Systematic inquiry involves thorough, methodical, contextually relevant and empirical inquiry into evaluation questions. It is one of the guiding principles of the American Evaluation Association.
    Method
  • Transparency

    Transparency refers to the evaluation processes and conclusions being able to be scrutinised.
    Method
  • Ethical practice

    Ethical practice in evaluation can be understood in terms of designing and conducting an evaluation to minimise any potential for harm and to maximise the value of the evaluation.
    Method
  • Accuracy

Accuracy refers to the correctness of the evidence and conclusions in an evaluation; it may also imply precision.
    Method
  • Accessibility

Accessibility of evaluation products includes consideration of the format and access options for reports, including plain language, inclusive print design, material in multiple languages, and material in alternative formats.
    Method
  • Competence

    Competence refers to ensuring that the evaluation team has or can draw on the skills, knowledge and experience needed to undertake the evaluation.
    Method