What can be done to support the use of evaluation? How can evaluators, evaluation managers and others involved in or affected by evaluations support the constructive use of findings and evaluation processes?
We've now completed the first component of our user research: the user survey. Thank you for the helpful feedback from 50 different countries and from a wide range of users, including evaluators, people who sometimes do evaluation, evaluation managers, users of evaluation, people involved in evaluation capacity strengthening, students, and others. (If you missed the chance, we're always pleased to get feedback through our contact form.)
While there are many guidelines and tools to support those conducting evaluations, there are far fewer resources specifically focused on commissioners and managers of evaluation.
This guest blog is by Anne Markiewicz, Director of Anne Markiewicz and Associates, a consultancy that specialises in developing Monitoring and Evaluation Frameworks. Anne is the co-author, with Ian Patrick, of the textbook ‘Developing Monitoring and Evaluation Frameworks’ (Sage, 2016). She has extensive experience in designing and implementing monitoring and evaluation frameworks for a wide range of initiatives, and in building the capacity of organisations to plan for monitoring and evaluation.
Chris Lysy, of Lysy Design, (and also known as Fresh Spectrum's 'evaluation cartoonist'), recently made our day by storifying an example of a logic model Patricia Rogers had previously created for the UNICEF Impact Evaluation Series in Brief 2: Theory of Change. With a few simple changes, Chris has managed to turn a rather static diagram into something that is more visually appealing and understandable to stakeholders.
He's kindly let us share it with you here. You can find the original post, along with more of Chris' writing on data design, on the Lysy Design website, where there's also help in telling the story of your data if you need it.
We've received a lot of useful feedback since launching our user experience survey. Alongside the positive, we've gained insight into some of the difficulties experienced when using and navigating the BetterEvaluation website.
Thank you to everyone who has completed the survey! All of this feedback has been incredibly helpful for us to understand what's working well and what needs improvement.
The goal of BetterEvaluation is simple: to improve the theory and practice of monitoring and evaluation.
In this flipped conference session, we invite participants (evaluators, evaluation managers, and evaluation capacity developers around the world) to build and share knowledge about what can be done to support the use of evaluation findings after they've been reported.
Together we will explore these questions:
Q1. What are the different options for supporting the use of evaluation findings once reporting has been done?
We have all been there. You dive into a new book or head to a conference, workshop or course and come out all fired up about a new evaluation method. But when you get back to the real world, applying it turns out to be harder than you thought! What next?