When you suggest to someone that you’re interested in using systems thinking in your project, you’ll often hear “we tried that once…” or “I like the idea of it but I’m not really sure how I would apply it” or “I get what it is, but I don’t really get what it is!” These are all comments that I have heard over the last three years when asked, “So what are you working on?”
In this guest blog, GPFE Secretariat members Ada Ocampo and Asela Kalugampitiya give us an overview of some of the highlights of the recent EvalColombo2018 event, a three-day forum that ran from 17 to 19 September 2018 in Colombo, Sri Lanka. The forum aimed to promote the demand for and use of evaluation by parliamentarians through dialogue and exchange, and to generate innovative approaches to shared challenges at a global level.
In this guest blog, Fran Demetriou (Lirata Consulting and volunteer M&E advisor for the Asylum Seeker Resource Centre’s Mentoring Program) shares her reflections from the Australasian Evaluation Society (AES)'s 2018 conference, held in Launceston. In particular, she asks: what lessons might a young and emerging evaluator take away from the event?
Development actors are embracing the concept and practice of adaptive management, using evidence to inform ongoing revisions throughout implementation. In this guest blog, Heather Britt, Richard Hummelbrunner and Jackie Greene discuss a practical approach that donors and partners can use to agree on what’s most important to monitor as a project continues to evolve.
In this guest blog, Jo Hall discusses how evidence of outcomes and impact can be better captured, integrated and reported on across different scales of work.
Evaluation reporting is important. While there are many innovative methods to grab your audience's attention, the evaluation report is still an important vehicle to get your key messages across. In this blog, Alice Macfarlan shares her tips for editing a report draft with an audience focus in mind.
In part 1 of this two-part blog series, Greet Peersman and Patricia Rogers introduce the ‘Pathways to advance professionalisation within the context of the AES’ project and report. They explore the four pathways identified in the report: 1) Ad hoc, disconnected activities; 2) Focused, connected and strategic activities; 3) Voluntary credentialing of evaluators; and 4) Regulated and licensed profession. They then discuss their recommendation that the Australasian Evaluation Society follow a pathway of focused, connected and strategic activities, with a view to considering a voluntary credentialing process down the track.
In the previous blog in this series, Greet Peersman and Patricia Rogers introduced the ‘Pathways to advance professionalisation within the context of the AES’ project and report. A major feature of this report is the exploration of 41 activities and approaches that can be used to advance the professionalisation of monitoring and evaluation, and the conclusion of this two-part series looks at these approaches in more detail. We believe these activities are likely to be of considerable interest to others who are undertaking or planning evaluation capacity strengthening activities and we encourage you to share your feedback and thoughts on these activities at the end of this blog.
This guest post from Caroline Heider (Director General and Senior Vice President, Evaluation, World Bank Group) was originally published on the IEG's #WhatWorks blog. In this post, Caroline reflects on the motivations behind her 2017 Rethinking Evaluation blog series, which was dedicated to unpacking and debating the DAC evaluation criteria, and how this series links with the consultation process recently launched by the DAC Network on Development Evaluation to review the criteria. We've reposted this blog on BetterEvaluation because we believe that this is an important conversation to have, and we encourage our readers to get involved in the consultation process.
Evaluation frameworks are often developed to provide a common reference point for evaluations of different projects that form a program, or for different types of evaluations of a single program. But getting agreement on a shared document is only the start of achieving the intended benefits of evaluation frameworks, such as reduced duplication and overlap, improved data quality, and ease of aggregation and synthesis. This guest blog by George Argyrous (ANZSOG) outlines nine actions that can be taken to support the implementation of high-level monitoring and evaluation frameworks, and to make sure these frameworks don't languish on a dusty shelf.