155 results

  • The use of monitoring and evaluation in agriculture and rural development projects

    The document reviews monitoring and evaluation practices carried out in agricultural and rural development projects financed by the World Bank.
    Resource
  • DIY M&E: A step-by-step guide to building a monitoring and evaluation framework

    This guide, written by Dana Cross of Grosvenor Management Consulting, gives an overview of how to create an M&E framework.
    Resource
  • Conference on Improving the use of M&E - Keynote speech by Marlene Läubli Loud

    This keynote presentation, given by Marlene Läubli Loud at the CDI Conference 2014: Improving the use of M&E processes and findings, presents the current state of affairs regarding the utilisation of M&E processes and findings.
    Resource
  • Discussion Paper: Innovations in Monitoring and Evaluation

    This discussion paper produced by the United Nations Development Programme discusses various innovations that are occurring in M&E, and the advantages and disadvantages of these methods.
    Resource
  • How to build M&E systems to support better government

    This volume highlights the experience of several countries which have succeeded in building a well-functioning government M&E system, including Chile, Colombia and Australia.
    Resource
  • Some nuts and bolts questions about coding

    This guest blog by Helen Marshall springs from discussions of the Qualitative Interest Group (QIG) that Helen coordinates. QIG meets monthly in Melbourne, Australia, to discuss issues around researching with qualitative data.
    Blog
  • The logical framework approach

    This publication is part of a series of guidelines developed by AusAID in relation to activity design.
    Resource
  • The monitoring and evaluation system for Brazil's social promotion and protection programmes

    A presentation of the Brazilian system for monitoring and evaluating social programmes, delivered as part of Brazil-Africa cooperation.
    Resource
  • Coding part 2: Thematic coding

    This video tutorial from Graham R Gibbs (2010) provides an overview of thematic coding, with examples demonstrating how it is done and how codes can be applied to the data.
    Resource
  • Coding part 1: Alan Bryman's 4 stages of qualitative analysis

    In this web video, Graham R Gibbs provides an overview of qualitative analysis based on Alan Bryman's four stages of analysis.
    Resource
  • Analysing qualitative data using Microsoft Word

    In this slide show, Jenna Condie, who is presenting a Qualitative Methods in Psychology (QMiP) workshop, explains how Word supports detailed coding, including developing detailed definitions of the codes and tracking comments and emerging ideas.
    Resource
  • Using Word & Excel to analyze qualitative data with Seth Tucker

    Extracting meaningful findings from qualitative data requires an evaluator to have the right tools to be able to organize, code, and immerse themselves in the data.
    Resource
  • The art of coding with NVivo

    During this virtual workshop, Dr. Penna demonstrated how NVivo, a qualitative data analysis software package, is used to code data, document the data analysis process, and present results visually to increase credibility.
    Resource
  • Multiple lines and levels of evidence

    Multiple lines and levels of evidence (MLLE) is a systematic approach to causal inference that involves bringing together different types of evidence (lines of evidence) and considering the strength of the evidence in terms of different indicators.
    Method
  • Journals and logs

    Journals and logs are forms of record-keeping tools that can be used to capture information about activities, results, conditions, or personal perspectives on how change occurred over a period of time.
    Method
  • Integrity

    Integrity refers to ensuring honesty, transparency, and adherence to ethical behaviour by all those involved in the evaluation process.
    Method
  • Cultural competency

    Cultural competency involves ensuring that evaluators have the skills, knowledge, and experience necessary to work respectfully and safely in cultural contexts different from their own.
    Method
  • Feasibility

    Feasibility refers to ensuring that an evaluation can be realistically and effectively implemented, considering factors such as practicality, resource use, and responsiveness to the programme's context, including factors such as culture and politics.
    Method
  • Inclusion of diverse perspectives

    Inclusion of diverse perspectives requires attention to ensure that marginalised people and communities are adequately engaged in the evaluation.
    Method
  • Independence

    Independence can include organisational independence, where an evaluator or evaluation team can independently set a work plan and finalise reports without undue interference, and behavioural independence, where evaluators can conduct and report on an evaluation without undue pressure.
    Method
  • Evaluation accountability

    Evaluation accountability relates to processes in place to ensure the evaluation is carried out transparently and to a high-quality standard.
    Method
  • Transferability

    Transferability involves presenting findings in a way that they can be applied in other contexts or settings, considering the local culture and context to enhance the utility and reach of evaluation insights.
    Method
  • Utility

    Utility standards are intended to increase the extent to which program stakeholders find evaluation processes and products valuable in meeting their needs.
    Method
  • Professionalism

    Professionalism within evaluation is largely understood in terms of high levels of competence and ethical practice.
    Method
  • Propriety

    Propriety refers to ensuring that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in it and those affected by its results.
    Method
  • Systematic inquiry

    Systematic inquiry involves thorough, methodical, contextually relevant and empirical inquiry into evaluation questions. Systematic inquiry is one of the guiding principles of the American Evaluation Association.
    Method
  • Transparency

    Transparency refers to evaluation processes and conclusions being open to scrutiny.
    Method
  • Ethical practice

    Ethical practice in evaluation can be understood in terms of designing and conducting an evaluation to minimise any potential for harm and to maximise the value of the evaluation.
    Method
  • Accuracy

    Accuracy refers to the correctness of the evidence and conclusions in an evaluation; it may also imply precision.
    Method
  • Accessibility

    Accessibility of evaluation products includes consideration of the format and access options for reports, including plain language, inclusive print design, material in multiple languages, and material in alternative formats.
    Method
  • Competence

    Competence refers to ensuring that the evaluation team has or can draw on the skills, knowledge and experience needed to undertake the evaluation.
    Method
  • Outcome harvesting

    Outcome Harvesting collects (“harvests”) evidence of what has changed (“outcomes”) and, working backwards, determines whether and how an intervention has contributed to these changes.
    Approach