This blog post from Sarah Plachta Elliott describes how a System of Supports and Opportunities was used by Brandeis University’s Center for Youth and Communities in six neighborhoods in Detroit, Michigan, to inform the work of the Skillman Foundation in improving the school and neighborhood conditions for children.
Event | Webinar | 16th February, 2015 to 25th February, 2015 | Online | Paid
Presented by Scott Chaplowe, this eStudy introduces six key planning steps for a successful monitoring and evaluation (M&E) system: 1) Identify the purpose and scope of the M&E system; 2) Plan for data collection and management; 3) Plan for data analysis; 4) Plan for information reporting and utilization; 5) Plan for M&E human resources and capacity building; 6) Prepare the M&E budget. This six-step approach has been designed to guide programming at the community, regional and national levels. While informed by international programs and projects, it is equally appropriate for domestic (US) ones – wherever M&E is needed to provide reliable and useful information and reporting that informs program management and upholds performance accountability.
This practice paper from IDS captures lessons from recent experiences on using ‘theories of change’ amongst organisations involved in the research–policy interface.
The literature in this area highlights much of the complexity inherent in the policymaking process, as well as the challenges of finding meaningful ways to measure research uptake. As a tool, 'theories of change' offers much, but the paper argues that the very complexity and dynamism of the research-to-policy process means that any theory of change will be inadequate in this context. Therefore, rather than building an elaborate but static depiction of change at the start (to be evaluated at the end), incentives need to be in place to collect evidence around the theory regularly, test it periodically, and then reflect on and reconsider its relevance and assumptions.
Event | Seminar | 11th September, 2015 | Australia | Free
RMIT's Centre for Applied Social Research presents this special panel discussion on taking your research to the public and making an impact. Producing a research report, book or journal article is all well and good, but often this isn’t enough to make an impact outside academia. We’ve asked Professor Patricia Rogers, BetterEvaluation Director, and a panel of senior researchers to discuss how they’ve gone beyond the publication to have their research talked about and listened to, and what frustrations this sometimes entails.
‘M&E on the Cutting Edge’ Conference: ‘Partnering for Success: How Monitoring and Evaluation can strengthen Partnerships for Sustainable Development’
Event | Conference | 17th March, 2016 to 18th March, 2016 | Netherlands | Paid
This international conference is organised by the Centre for Development Innovation (CDI), Wageningen UR, and Learning by Design, in collaboration with the PPPLab. The two-day conference (17-18 March) will connect the realities of those working in practice with ideas from people who are thought leaders on Partnerships, Monitoring and Evaluation and Sustainable Development. Keynote presentations, paper presentations, workshops, panel discussions and plenary discussions will ensure a lively and thoughtful opportunity to question one’s own practice and find inspiration for new ideas. The programme includes more than 25 contributions from all over the world.
Blog | 2nd February, 2018
This is the second of a two-part blog on strategies to support the use of evaluation, building on a session the BetterEvaluation team facilitated at the American Evaluation Association conference last year. While the session focused particularly on strategies to use after an evaluation report has been produced, it is also important to address use before and during an evaluation.
This report by the DFID-ESRC Growth Research Programme (DEGRP) describes in detail how the researchers turned findings into recommendations, and how the various stages of stakeholder consultation influenced different elements of the project. It includes a step-by-step illustration of the stakeholder engagement process, which is also available as a separate infographic.
The Psychology of Climate Change Communication: A Guide for Scientists, Journalists, Educators, Political Aides, and the Interested Public
Resource | Guide | 2009
This guide by the Center for Research on Environmental Decisions, while focused on communicating research on climate change, will be useful for anyone interested in the theory behind communication and behaviour change and those who need to communicate evaluation results effectively to specific target audiences or the general public.
Blog | 13th March, 2020
Blog | 13th October, 2021
Evaluation use is a key issue for the evaluation community. The aim of evaluation is to be influential, so it should be of use to policymakers, programme developers, and project planners and managers. I recently used a survey of evaluators to explore the concept of evaluation use, how evaluation practitioners view it, and how this translates into their work – in other words, how evaluators are reporting and supporting evaluation use and influence.