7 results
Choosing appropriate designs and methods for impact evaluation - Department of Industry, Innovation and Science
The Department of Industry, Innovation and Science has commissioned this report to explore the challenges and document a range of possible approaches for the impact evaluations that the department conducts. (Resource)

Week 19: Ways of framing the difference between research and evaluation
One of the challenges of working in evaluation is that important terms (like ‘evaluation’, ‘impact’, ‘indicators’, ‘monitoring’ and so on) are defined and used in very different ways by different people. (Blog)

Week 19: Ways of describing the difference between research and evaluation
One of the challenges of working in evaluation is that important terms (like ‘evaluation’, ‘impact’, ‘indicators’, ‘monitoring’ and so on) are defined and used in very different ways, by (Blog)

Week 29: Evaluation design and unintended consequences or, from firefighting to systematic action
This week’s blog is from Jonny Morell, editor of Evaluation and Program Planning and author of (Blog)

BetterEvaluation community's views on the difference between evaluation and research
In May we blogged about ways of framing the difference between research and evaluation. We had terrific feedback on this issue from the international BetterEvaluation community and this update shares the results. (Blog)

User feedback on the difference between evaluation and research
This page contains thoughts from the BetterEvaluation community provided in response to the blog post on (Blog)

Negative programme theory
Most programme theories, logic models and theories of change show how an intervention is expected to contribute to positive impacts; negative programme theory, a technique developed by Carol Weiss, shows how it might produce negative impacts. (Method)