
IPDET 2024 On-Site program

In person

The IPDET 2024 On-Site program will take place in Bern, Switzerland, from July 15 to 26, 2024!

Participants can attend the entire two weeks or just one:

  • Week 1, Core Course: July 15 – 19, 2024
  • Week 2, Workshops: July 22 – 26, 2024


Week 1: Core Course

IPDET's one-week Core Course provides the fundamentals of evaluation. As an introductory course, it gives participants a comprehensive overview of evaluation and how to do it. The course is designed for those with little or no prior experience in evaluation, as well as for those seeking to refresh and update their basic knowledge.

In seven modules, participants learn the key concepts, approaches, methods, processes, standards, and tools for planning, implementing, and using evaluations. The interactive learning environment combines theory and practice through modern teaching methods.

Image: additional details about Week 1. Scholarships are available, with an application deadline of April 5. Course tutors: Candice Morkel, Wolfgang Meyer, Reinhard Stockmann, and Stephanie Kapp.


Week 2: In-depth workshops

The second week provides two sessions of in-depth workshops and allows participants to customize their schedules according to their backgrounds, particular needs, and interests.

  • Session one: July 22 – 24, 2024
  • Session two: July 25 – 26, 2024

Six workshops run in parallel in each session, so only one workshop can be booked per session.

Session 1, Week 2 | July 22 – 24 | 2.5 days long

I-A: Quantitative Data Collection: Planning and Implementing Surveys

How do you plan and implement a survey? What are the challenges and pitfalls, and how can you overcome them? In this workshop, you will enter the world of quantitative data collection. You will learn how to set up a survey, design a questionnaire, and obtain valid and reliable results. You will be provided with foundational methodological knowledge, real-world examples, and hands-on group exercises. So, if you plan to conduct a survey anytime soon, be sure to check out this workshop!

Instructor: Stefan Silvestrini

Recommended for: Commissioners, Evaluators, Policy Makers

Level: Consolidation

I-B: Designing Evaluations

An evaluation is an assessment of a policy, program, or project and serves the purposes of learning and accountability. To achieve these purposes, evaluations need to be deliberately designed to enhance their potential utility while ensuring rigor. This workshop will discuss the foundations of evaluation design. Participants will develop a sound understanding of the building blocks of evaluation design through structured discussions and exercises around real-world evaluation examples.

Instructor: Jos Vaessen

Recommended for: Evaluators, Management, Practitioners

Level: Consolidation

I-C: Quantitative Impact Evaluation

Are experimental methods the gold standard of evaluation? Are there other valid and useful methods to answer questions about impact? What does that mean and why should we care? If you are interested in impact evaluation, but too afraid (of math and statistical formulae) to ask, this is the workshop for you.

Instructor: Claudia Maldonado

Recommended for: Activists, Commissioners, Evaluators, Policy Makers, Practitioners

Level: Intermediate

I-D: Theory-based Causal Analysis: the Basics of Process Tracing and QCA

This workshop covers the following topics: review of various causal theories and their affiliated theory-based evaluation methods; fundamentals of designing a theory-based case-based causal analysis; applications using within case causal analysis such as Process Tracing; and applications to enhance generalizability of causal claims through cross-case causal analysis (including brief introduction of QCA).

Instructor: Estelle Raimondo

Recommended for: Evaluators, Researchers

Level: Advanced

I-E: From Data to Decisions: Integrating Machine Learning in Evaluation

The workshop introduces the fundamentals of integrating big data science and machine learning algorithms into your evaluation approach. You'll learn about Bayesian theory, predictive and prescriptive analytics, and how to address selection and algorithmic bias. You will be guided through an interactive step-by-step process of building evaluation models with primary and secondary datasets using an open-source, no-cost, no-code visual-based analytics platform.

Instructor: Peter York

Level: Introductory

I-F: Evaluation at the Nexus of Environment and Development

For evaluation to remain relevant in today’s world, it must recognize the interconnectedness of the human and natural systems. As evaluators, we must see the larger picture in which the interventions we evaluate operate. We can no longer evaluate projects and programs in isolation, based only on their internal logic, as if they existed in a vacuum.

Instructor(s): Juha Uitto, Geeta Batra, Anupam Anand

Recommended for: Activists, Commissioners, Evaluators, Parliamentarians, Policy Makers, Practitioners

Level: Intermediate

Session 2, Week 2 | July 25 – 26 | 2 days long

II-A: Introduction to Quantitative Data Analysis

This hands-on workshop will enable you to present your evaluation findings in a professional, convincing, and pleasing manner using tools such as descriptive statistics, bivariate analysis, ANOVA, regression analysis, and data visualization. It will introduce you to evidence-based decision-making tools that allow you to demonstrate confidently whether the program you evaluated works.

Instructor: Bassirou Chitou

Recommended for: Evaluators, Commissioners, Policy Makers

Level: Consolidation

II-B: From Insights to Influence – Communication Strategies to Amplify Evaluation Utility

Are you tirelessly crafting evaluation reports, only to see them gather dust without making a real impact? Too often, reports lack the narrative and creative spark that ignites action and influence. Here, you'll learn to transform your data into compelling stories. We'll equip you with innovative storytelling techniques and communication strategies tailored to various stakeholders. Our goal? To ensure your evaluation findings don't just inform but are used to inspire action and meaningful change.

Instructor: Ann-Murray Brown

Recommended for: Commissioners, Evaluators, Management, Policy Makers, Practitioners, Researchers

Level: Consolidation

II-C: Developing Monitoring and Evaluation Systems in Organizations

An M&E system within an organization aims to ensure the quality of the organization's work and to increase its overall steering capability. The workshop covers the necessary requirements: an M&E policy, a structural anchoring of M&E, a regulatory framework, quality assurance, a budget, qualified personnel, and defined mechanisms for stakeholder participation and for the use of evaluation results.

Instructor(s): Reinhard Stockmann, Wolfgang Meyer

Recommended for: Management, Practitioners, Commissioners, Evaluators

Level: Introductory

II-D: Evaluating Humanitarian Action: Steps, Challenges, and Real-time Learning

What sets apart evaluations of humanitarian action? This workshop offers an overview of the critical steps and challenges in evaluating humanitarian action and how to address them; a deep dive into real-time learning-oriented evaluation, using case studies from Ukraine and elsewhere; practitioner insights into the ethical aspects of humanitarian evaluation; and cross-cutting considerations for evaluations, including accountability to affected persons and localization.

Instructor(s): Dmytro Kondratenko, Margie Buchanan-Smith, Hana Abul Husn

Recommended for: Commissioners, Evaluators, Management, Practitioners

Level: Introductory to intermediate

II-E: Designing and Managing Projects For Results and Evaluability

What if we can build an “evidence eco-system” around our projects and programs to make them more effective, and our evaluations more useful and insightful? Evidence to support evaluations should be identified as early as the project design stage and evolve throughout the project life cycle. In the workshop, participants will better understand how an integrated approach to building an evidence-based and results-orientated design is critical for project success and more effective evaluations.

Instructor(s): Stephen Pirozzi, Kamal Siblini

Recommended for: Commissioners, Evaluators, Practitioners

Level: Introductory to intermediate

II-F: Evaluation of climate change and development

Based on current concepts, frameworks, practical examples, and interactive group work, this workshop covers methods and tools for evaluating climate finance and development impact, including evidence syntheses, advanced case study analysis, vulnerability assessments, and geospatial evaluation.

Instructor(s): Sven Harten, Martin Noltze, Alexandra Köngeter

Recommended for: Activists, Commissioners, Evaluators

Level: Intermediate

Image: summary of the Week 2 workshops described above, with photos of the tutors. Scholarships are available until April 5.