20 results
How to design an M&E framework for a policy research project (Resource)
This Methods Lab guidance note focuses on designing and structuring a monitoring and evaluation framework for a policy research project.

'Context Matters' framework for improving evidence use: what do policymakers and practitioners think about it? (Resource)
This blog introduces the 'Context Matters' framework, a living tool that builds on and contributes to learning and thinking on evidence-informed policy making by providing a lens through which to examine the context (internal and external).

Successful public policy: Lessons from Australia and New Zealand (Resource)
This book is a collection of 20 examples of successful public policies in Australia and New Zealand. It aims to reset the agenda for teaching, research and dialogue on public policy performance.

Un-boxing evaluation through developmental and agile approaches (Blog)
Guest author Nerida Buckley discusses how un-boxing evaluation can benefit from looking at practices from developmental and agile approaches.

Beyond the evaluation box – Social innovation with Ingrid Burkett (Blog)
This blog is the sixth in our series about un-boxing evaluation – the theme of aes19 in Sydney.

Week 19: Ways of framing the difference between research and evaluation (Blog)
One of the challenges of working in evaluation is that important terms (like 'evaluation', 'impact', 'indicators', 'monitoring' and so on) are defined and used in very different ways by different people.

Semana 19: Formas de descrever a diferença entre pesquisa e avaliação (Blog; Portuguese version of 'Week 19' above)
One of the challenges of working in evaluation is that important terms (such as 'evaluation', 'impact', 'indicators', 'monitoring' and so on) are defined and used in very different ways by different people.

BetterEvaluation community's views on the difference between evaluation and research (Blog)
In May we blogged about ways of framing the difference between research and evaluation. We had terrific feedback on this issue from the international BetterEvaluation community, and this update shares the results.

Pathways to professionalisation – Part 1: Professionalisation within the context of the AES (Blog)
In part 1 of this two-part blog series, Greet Peersman and Patricia Rogers introduce the 'Pathways to advance professionalisation within the context of the AES' project and report.

Pathways to professionalisation – Part 2: Options for professionalisation (Blog)
In the previous blog in this series, Greet Peersman and Patricia Rogers introduced the 'Pathways to advance professionalisation within the context of the AES' project and report.

AES 2018 conference reflections: Power, values, and food (Blog)
In this guest blog, Fran Demetriou (Lirata Consulting and volunteer M&E advisor for the Asylum Seeker Resource Centre's Mentoring Program) shares her reflections from the recent Australasian Evaluation Society (AES) 2018 conference.

What does it mean to 'un-box' evaluation? (Blog)
This guest blog by Jade Maloney is the first in a series about un-boxing evaluation – the theme of aes19 in Sydney, Australia.

Un-boxing NGO evaluation (Blog)
This blog is the fourth in our series about un-boxing evaluation – the theme of aes19 in Sydney, Australia.

4 tips for planning your policy research M&E (Blog)
In this guest blog post, Tiina Pasanen, from the Overseas Development Institute (ODI), lays out four key ideas to keep in mind when designing an M&E framework for a policy research project.

What would an evaluation conference look like if it was run by people who know and care about presenting information to support use? (hint – that should be us) (Blog)
All too often, conferences fail to make good use of the experience and knowledge of the people attending: most of the time is spent presenting prepared material that could be better delivered in other ways, and not enough is spent on discussion.

User feedback on the difference between evaluation and research (Blog)
This page contains thoughts from the BetterEvaluation community provided in response to the blog post on ways of framing the difference between research and evaluation.

The rubric revolution (Resource)
Three linked presentations from Jane Davidson, Nan Wehipeihana & Kate McKegg explaining how rubrics can be used to ensure evaluations validly answer evaluative questions.

Making a difference: M&E of policy research (Resource)
The paper presents examples and approaches for conducting M&E of policy research, drawing on the current experience of a range of research institutes, think tanks and funding bodies.

Tools for policy impact: A handbook for researchers (Resource)
The Overseas Development Institute (ODI), as part of its Research and Policy in Development (RAPID) programme, has been looking at the links between research and policy for several years.

Public impact fundamentals and observatory (Resource)
The Public Impact Fundamentals are a framework developed by the Centre for Public Impact to assess what makes a successful policy outcome and describe what can be done to maximise the chances of achieving public impact.