14 results

DFAT design and monitoring and evaluation standards (Resource)
These updated design, monitoring and evaluation standards from the Australian Government aim to "improve the quality and use of Design and M&E products, and to integrate evaluative thinking into everyday work".

Dr. Richard Kreuger on qualitative listening (Resource)
In this interview in The Listening Resource blog (August 29th, 2013), Susan Eliot talks to Dr. Richard Kreuger.

Making rigorous causal claims in a real-life context: Has research contributed to sustainable forest management? (Resource)
This article presents an example of a rigorous non-counterfactual causal analysis, describing how different evidence and methods were used together for causal inference without a control or comparison group.

RQ+ Research Quality Plus: A Holistic Approach to Evaluating Research (Resource)
This report describes a holistic approach and assessment framework for evaluating research that goes beyond the traditional deliberative means (e.g., peer review) and commonly used analytics (e.g., bibliometrics).

Transcribe (Resource)
Transcribe is a very useful tool if you need to transcribe a small amount of data and don't have access to some of the more professional, downloadable transcription software packages.

A guide for designing and conducting in-depth interviews for evaluation input (Resource)
This short guide defines in-depth interviews, explains their advantages and disadvantages, and outlines the steps involved in their application.

Action and reflection: a guide for monitoring and evaluating participatory research (Resource)
This paper from the International Development Research Centre (IDRC) was designed to support those involved in participatory research and development projects with monitoring and evaluation strategies.

Week 34: Alternatives to transcribing interviews (Blog)
Being able to compare alternatives is essential when designing an evaluation.

Developing a research agenda for impact evaluation (Blog)
Impact evaluation, like many areas of evaluation, is under-researched. Doing systematic research about evaluation takes considerable resources and is often constrained by the availability of information about evaluation practice.

Institutional history (Method)
An institutional history (IH) is a narrative that records key points about how institutional arrangements – new ways of working – have evolved over time and have created and contributed to more effective ways to achieve project or programme …

Assessing the impact of research on policy (Resource)
The authors of this review analyse various evaluation methods (including ethnographic and quantitative approaches, focus groups, process tracing, and network mapping and analysis) to find out which ones are the most suitable to evaluate the …

Projects assuming responsibility over evaluation: Test-driving utilisation focused evaluation (Resource)
This presentation from Developing Evaluation Capacity in ICT4D (DECI) outlines the objectives of the project and their use of …

Monitoring the composition and evolution of the research networks of the CGIAR research program on roots, tubers and bananas (RTB) (Resource)
This Brief provides an example of how Social Network Analysis (SNA) can be used in the context of agricultural research.

Pathways to advance professionalisation within the context of the AES (Resource)
This report by Greet Peersman and Patricia Rogers for the Australasian Evaluation Society (AES) identifies four potential pathways towards professionalisation within the context of the AES. These pathways are as follows: …