Realist evaluation

Complexity theory and “invisible mechanisms”: Implications for methods and commissioning

This presentation continues the CECAN series on realist research and evaluation and their use in relation to complexity. It focuses on the issue of causation. Causation is of critical interest to policy and program authors, who seek to cause (or steer) change; to evaluators, who seek to attribute outcomes to interventions; and to researchers, who seek to understand particular aspects of the operation of complex systems.


2017 International Realist Conference

23rd August 2017 by Better-Admin

We've got our heads in realism this week, partly because early-bird registrations for the 2017 International Realist Conference close soon (the early-bird registration deadline has been extended until Thursday, September), and partly because we've been shown Chris Lysy's realist cartoon series (commissioned by the Rameses project), which made us giggle. You can view the full series of cartoons on the Rameses website, along with a number of other great resources about realist evaluation, including Ray Pawson's video series.

The "Context+Mechanism" Association: Mastering a Key Heuristic in Realist Evaluation for Innovating Complex Programmes and Policy

This webinar is the second part of a series on Realist Methodology for the Centre for Evaluation of Complexity across the Nexus. Undertaking inquiry using the realist approach involves analysing complexity in terms of context-mechanism-outcome (CMO) configurations. Confusion often arises in determining which data should fall under 'context' and which under 'mechanism' in the process of configuring. This webinar will provide working definitions for context, mechanism and outcome, and will introduce examples of data being transformed into CMO configurations to show how this can be done. The goal is to stimulate ideas around how to define concepts, theorize programmes and configure data in realist analysis, with the ultimate ambition of increasing capacity for using realist evaluation to assess and innovate programmes.

4th Annual CARES Summer School for Realist Methodology Training

The annual CARES Summer School is a 4-day intensive training in realist methodology (evaluation and synthesis). The programme is designed to help participants advance their projects using the methodology and gain greater clarity in applying realist principles to complex areas of assessment. The application of realist methodology to research projects is a craft. For many of us, collective effort and co-learning are fundamental to developing an inspired capacity for undertaking this approach.

2017 International Realist Conference

The 2017 Realist Conference invites realist researchers, evaluators, theorists and methodologists of all descriptions, along with those who commission realist work and those who use it to inform practice and policy, to come together to answer: In what circumstances and for whom have realist methods been useful, in what respects, and why? In what contexts have they not proved useful, and why? How do the specific methods we use in our research or evaluation contribute (or not) to their use? What new developments or methods would further support their use?


Evidence-Based Medicine & Evidence-Based Policy: The world’s most perfectly developed method & the 79-pound weakling?

Conventional narratives have honoured clinical (especially pharmaceutical) RCTs as the world's most perfectly developed method. The quarrelsome, paradigm-heavy field of EBP is often dismissed as a 79-pound weakling. This presentation seeks to tear up these storylines.

Oxford: Realist Reviews and Realist Evaluation

This module will provide participants with an understanding of realist review (or synthesis) and realist evaluation. Participants should then be able to apply their new knowledge and skills to their own realist research project, regardless of which field of research they come from. In health care and many other fields of research, interventions are often described as complex, with outcomes that depend on context. When these complex interventions fail to achieve their desired outcomes, the explanation frequently offered is that they are both complex and context dependent. Realist research approaches (realist evaluation and realist review) can help make sense of these types of interventions or programmes. The approaches are theory driven: they develop structurally coherent explanations of interventions and test these against empirical data. In realist evaluation, the researcher or evaluator's task is to gather the data, making it a form of primary research; in realist review, the data come from documents (e.g. studies, policy documents and so on), making it a form of secondary research.

Theory-based evaluation for complex systems and Realist qualitative analysis

Many policies and programs are implemented in large systems, or aim to make changes at multiple levels of a system. Many approaches to program theory either assume that the program itself is simple, or ignore the implications of context for whether and how programs work.

The first day of this program will compare various approaches to 'systems', 'complexity' and 'context', including introducing realist perspectives. Participants will explore the implications for program design and for commissioning and conducting evaluations, and in particular, the many uses of theory for dealing with complexity.

Realist Evaluation Workshop

The focus of this one-day workshop was to build practical skills for conducting realist evaluations of international development projects and programmes. The rapidly changing context of development assistance in recent years, combined with growing pressure on policymakers to demonstrate value for money, has led to criticism that many impact evaluation approaches lack rigour or fail to respond to this complex and shifting environment. Over the last decade this has led to a surge of interest in exploring alternative, yet still robust, approaches to impact evaluation.