Search

151 results

  • Principles-focused evaluation: The GUIDE

    How can programs and organizations ensure they are adhering to core principles—and assess whether doing so is yielding desired results?
    Resource
  • Carrying out qualitative research under lockdown – Practical and ethical considerations

    How can qualitative researchers collect data during social-distancing measures? Adam Jowett outlines several techniques researchers can use to collect data without face-to-face contact with participants.
    Resource
  • How can we use evaluation to support decision-making and reflection in this time of community crisis?

    A guide containing examples of supportive evaluation activities for organisations and leaders managing COVID-19 response efforts. This resource and the following information were contributed by Lauren Beriont.
    Resource
  • BetterEvaluation COVID-19 Statement

    The COVID-19 pandemic is rapidly transforming our world: Individuals, communities and organisations are facing enormous challenges and uncertainty.
    Blog
  • Adapting evaluation in the time of COVID-19 - Part 1: Manage

    Organisations around the world are quickly having to adapt their programme and project activities to respond to the COVID-19 pandemic and its consequences. We’re starting a new blog series to help support these efforts.
    Blog
  • Adapting evaluation in the time of COVID-19 - Part 3: Frame

    Evaluation needs to respond to the changes brought about by the COVID-19 pandemic. As well as direct implications for the logistics of collecting data and managing evaluation processes, the pandemic has led to rapid changes…
    Blog
  • Adapting evaluation in the time of COVID-19 - Part 4: Describe

    We’re continuing our series, sharing ideas and resources on ways of ensuring that evaluation adequately responds to the new challenges during the pandemic.
    Blog
  • Rapid evaluation

    Eleanor Williams is the Director of the Centre for Evaluation and Research Evidence at the Victorian Department of Health. In this role, she leads the department's evaluation and research strategy. 
    Blog
  • Evaluating the environmental impact of personal protective equipment (PPE) in the COVID-19 pandemic

    This Footprint Evaluation case study explores the feasibility and value of considering environmental sustainability in the evaluation of personal protective equipment (PPE) provisioning during the COVID-19 pandemic.
    Resource
  • Adapting evaluation in the time of COVID-19 - Part 2: Define

    The COVID-19 pandemic has led to rapid changes in the activities and goals of many organisations, whether these relate to addressing direct health impacts, the consequential economic and social impacts, or the need to change the way things are done.
    Blog
  • Multiple lines and levels of evidence

    Multiple lines and levels of evidence (MLLE) is a systematic approach to causal inference that involves bringing together different types of evidence (lines of evidence) and considering the strength of the evidence…
    Method
  • Journals and logs

    Journals and logs are record-keeping tools that can be used to capture information about activities, results, conditions, or personal perspectives on how change occurred over a period of time.
    Method
  • Use of administrative data for the COVID-19 response

    This blog introduces a video of a panel session describing how administrative data – routinely collected data – might be used to help with the response to the COVID-19 pandemic.
    Resource
  • Integrity

    Integrity refers to ensuring honesty, transparency, and adherence to ethical behaviour by all those involved in the evaluation process.
    Method
  • Cultural competency

    Cultural competency involves ensuring that evaluators have the skills, knowledge, and experience necessary to work respectfully and safely in cultural contexts different from their own.
    Method
  • Feasibility

    Feasibility refers to ensuring that an evaluation can be realistically and effectively implemented, considering factors such as practicality, resource use, and responsiveness to the programme's context, including its culture…
    Method
  • Inclusion of diverse perspectives

    Inclusion of diverse perspectives requires attention to ensure that marginalised people and communities are adequately engaged in the evaluation.
    Method
  • Independence

    Independence can include organisational independence, where an evaluator or evaluation team can independently set a work plan and finalise reports without undue interference, and behavioural independence, where evaluators can conduct and report on evaluations…
    Method
  • Evaluation accountability

    Evaluation accountability relates to the processes in place to ensure the evaluation is carried out transparently and to a high standard.
    Method
  • Transferability

    Transferability involves presenting findings so that they can be applied in other contexts or settings, considering the local culture and context to enhance the utility and reach of evaluation insights.
    Method
  • Utility

    Utility standards are intended to increase the extent to which program stakeholders find evaluation processes and products valuable in meeting their needs.
    Method
  • Professionalism

    Professionalism within evaluation is largely understood in terms of high levels of competence and ethical practice.
    Method
  • Propriety

    Propriety refers to ensuring that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in it and those affected by its results.
    Method
  • Systematic inquiry

    Systematic inquiry involves thorough, methodical, contextually relevant and empirical inquiry into evaluation questions. Systematic inquiry is one of the guiding principles of the American Evaluation Association.
    Method
  • Transparency

    Transparency refers to ensuring that the evaluation's processes and conclusions can be scrutinised.
    Method
  • Ethical practice

    Ethical practice in evaluation can be understood in terms of designing and conducting an evaluation to minimise any potential for harm and to maximise the value of the evaluation.
    Method
  • Accuracy

    Accuracy refers to the correctness of the evidence and conclusions in an evaluation. It can also imply precision.
    Method
  • Accessibility

    Accessibility of evaluation products includes consideration of the format and access options for reports, including plain language, inclusive print design, material in multiple languages, and material in alternative formats…
    Method
  • Competence

    Competence refers to ensuring that the evaluation team has or can draw on the skills, knowledge and experience needed to undertake the evaluation.
    Method
  • Outcome harvesting

    Outcome Harvesting collects (“harvests”) evidence of what has changed (“outcomes”) and, working backwards, determines whether and how an intervention has contributed to these changes.
    Approach
  • 52 weeks of BetterEvaluation: Week 16: Identifying and documenting emergent outcomes of a global network

    Global voluntary networks are complex beasts with dynamic and unpredictable actions and interactions. How can we evaluate the results of a network like this? Whose results are we even talking about?
    Blog
  • Validation workshop

    A validation workshop is a meeting that brings together evaluators and key stakeholders to review an evaluation's findings.
    Method