The end of the year is nearly upon us and we're putting together a list. Help us spread some end-of-year cheer by sharing your favourite monitoring and/or evaluation resource that you've used this year - particularly if it's available freely online!
As part of a project with an Australian state government agency, I am developing a rubric for people with little to no evaluation skills who might need to judge the quality of an evaluation report. This is within the context of a larger project whereby an evidence base of past evaluation reports is made available for program designers. We want users to access these reports, but also have some support to judge the quality of the reports.
When you suggest to someone that you’re interested in using systems thinking in your project, you’ll often hear “we tried that once…” or “I like the idea of it but I’m not really sure how I would apply it” or “I get what it is, but I don’t really get what it is!” These are all comments I’ve heard over the last three years when people ask me, “So what are you working on?”
In this guest blog, GPFE Secretariat members, Ada Ocampo and Asela Kalugampitiya, give us an overview of some of the highlights of the recent EvalColombo2018 event, a three-day forum that ran from 17 to 19 September 2018 in Colombo, Sri Lanka. The forum aimed to promote the demand for and use of evaluation by parliamentarians through dialogue and exchange, and to generate innovative approaches to shared challenges at a global level.
In this guest blog, Fran Demetriou (Lirata Consulting and volunteer M&E advisor for the Asylum Seeker Resource Centre’s Mentoring Program) shares her reflections from the Australasian Evaluation Society's (AES) 2018 conference, held recently in Launceston - in particular, the lessons a young and emerging evaluator might take away from the event.
Development actors are embracing the concept and practice of adaptive management, using evidence to inform ongoing revisions throughout implementation. In this guest blog, Heather Britt, Richard Hummelbrunner and Jackie Greene discuss a practical approach that donors and partners can use to agree on what’s most important to monitor as a project continues to evolve.
In this guest blog, Jo Hall discusses how evidence of outcomes and impact can be better captured, integrated and reported on across different scales of work.
Evaluation reporting is important. While there are many innovative methods to grab your audience's attention, the evaluation report is still an important vehicle to get your key messages across. In this blog, Alice Macfarlan shares her tips for editing a report draft with an audience focus in mind.
In part 1 of this two-part blog series, Greet Peersman and Patricia Rogers introduce the ‘Pathways to advance professionalisation within the context of the AES’ project and report. They explore the four pathways identified in the report: 1) ad hoc, disconnected activities; 2) focused, connected and strategic activities; 3) voluntary credentialing of evaluators; and 4) a regulated and licensed profession. They also discuss their recommendation that the Australasian Evaluation Society follow a pathway of focused, connected and strategic activities, with a view to considering a voluntary credentialing process down the track.
In the previous blog in this series, Greet Peersman and Patricia Rogers introduced the ‘Pathways to advance professionalisation within the context of the AES’ project and report. A major feature of this report is the exploration of 41 activities and approaches that can be used to advance the professionalisation of monitoring and evaluation, and the conclusion of this two-part series looks at these approaches in more detail. We believe these activities are likely to be of considerable interest to others who are undertaking or planning evaluation capacity strengthening activities and we encourage you to share your feedback and thoughts on these activities at the end of this blog.