Contribution Analysis is an approach for assessing causal questions and inferring causality in real-life programme evaluations. This resource offers a step-by-step approach designed to help managers, researchers and policymakers arrive at conclusions about the contribution their programme has made (or is making) to particular outcomes. The essential value of contribution analysis is that it offers an approach designed to reduce uncertainty about the contribution an intervention is making to observed results, through an increased understanding of why those results occurred (or did not!) and of the roles played by the intervention and by other internal and external factors.
An institutional history (IH) is a narrative that records key points about how institutional arrangements – new ways of working – have evolved over time and have created and contributed to more effective ways to achieve project or programme goals. An IH is generated and recorded in a collaborative way by scientists, farmers and other stakeholders. A key intention behind institutional histories is to introduce institutional factors into the legitimate narrative of success and failure in research organizations.
Horizontal evaluation is an approach that combines self-assessment by local participants and external review by peers. Originally developed to evaluate new methodologies for agricultural research and development, horizontal evaluation has wider potential for application. In its original setting, the focus of horizontal evaluation is the actual R&D methodology itself rather than the project per se or the team or organization that developed it.
The involvement of peers neutralizes the lopsided power relations that prevail in traditional external evaluations, creating a more favourable atmosphere for learning and improvement.
The central element of any horizontal evaluation is a professionally facilitated, three-day workshop that includes all of the steps and processes essential to this approach. The workshop brings together a group of 10-15 ‘local participants’ who are developing a new R&D methodology and a similar-sized group of ‘visitors’ or ‘peers’ who are also interested in the methodology. The workshop is organized from start to finish by a small group known as the ‘workshop organizers’ (a sub-group of the local participants). It combines presentations about the methodology with field visits, small-group work and plenary discussions, and it elicits and compares the perceptions of the two groups concerning the strengths and weaknesses of the methodology.
The perceptions of both groups are captured in an evaluation matrix, a key tool in this approach. The matrix is used to collect data, site by site, on a limited, pre-agreed set of relevant, highly focused criteria. The analysis phase produces practical suggestions for improvement, arising out of the strengths and weaknesses observed by the peer group and discussed in the workshop. Because these recommendations are intended to be put to use immediately, horizontal evaluation is essentially formative/developmental.
The processes employed during the workshop and field-visits serve to promote social learning among the different groups involved. Experience to date suggests the approach stimulates further experimentation with and development of the methodology in other settings. The authors believe that horizontal evaluation can be applied in different types of projects and programmes and is especially suited to those that operate in a multi-site, network mode.
The Increasing Participation in Evaluation bulletin was developed by Anita Baker with Beth Bruner to help organizations integrate evaluative thinking into their organizational practice. This three-page bulletin discusses how organization staff, evaluators, and funders are typically involved in participatory evaluation.
This module, produced by Catholic Relief Services (CRS) and the American Red Cross, provides information to help private voluntary organization staff facilitate learning among individuals, groups, and organizations by communicating and reporting evaluation processes and findings more effectively.
This presentation and paper from the Bruner Foundation guides the reader through the evaluation process, offering a step-by-step process for commissioning an evaluation, clarifying the differences between research and evaluation, discussing what evaluations should cost, and identifying further considerations before implementation.
This example of a press release discusses the release of a voluntary statewide survey about patients’ experiences with inpatient care at Massachusetts hospitals.