Week 36: Systems thinking
This is #2 in our series on visionary evaluation. This year’s AEA Conference theme is visionary evaluation – systems thinking, equity, and sustainability. Which raises the question: what is systems thinking?
There are dozens of definitions but for me it is the combination of three things:
- Understanding inter-relationships
- Engaging with multiple perspectives
- Reflecting on boundary choices
… And how does that contribute to visionary evaluation?
Because it has the potential to change how we do evaluation, and indeed what we think evaluation is about. Let’s take those three ideas in turn.

Evaluators use the idea of inter-relationships a lot, but often in a relatively limited way. Take your classic program logic. Evaluators generally worry and argue about what’s in the boxes and tend to ignore the arrows between them. In contrast, the systems and complexity field tends to focus more on what the arrows mean and rather less on what’s in the boxes.

Evaluators talk a lot about multiple perspectives, but do we really engage deeply with the consequences of those perspectives for the situations we evaluate? If we did, we’d never consider an intervention to have a single purpose or a single framing. Yet this is something our program logics and theories of change nearly always do.

Finally, if we assume that boundaries distinguish between what is important and what is unimportant, then boundary choices are essentially about what is valued. If we routinely reflected on and critiqued boundary choices, we’d never allow the values by which an intervention is judged to be determined solely by the program implementer or the evaluation client.

These are all big issues for evaluation to engage with, and they have the potential to change what we do quite fundamentally.
What’s the first thing an evaluator should do when trying to think more systemically?
Treat the systems and complexity field with the respect it deserves. It’s a big field and like the evaluation field has diverse methods and methodologies, big unresolved disputes and a history. Do your homework and avoid grabbing hold of simple clichés.
Simple clichés like?
Clichés like ‘systems approaches are about including everything’. That’s clearly impossible and will lead to worse evaluation practice, not better. Every endeavour is limited in some way – hence the focus on boundaries. So ‘holism’ for me is about being very smart, very informed and very considered about what to leave out, rather than opening the floodgates to more stuff. Another cliché is that systems approaches are only about big things. I frequently hear people talk about ‘systems change’ only in terms of large entities. That’s a notion that comes primarily from the management field rather than the systems field; something as small as a cell can be considered a system. The final cliché is that because systems approaches help us deal with ambiguous and uncertain situations, they must themselves be loose and unstructured. The way we understand situations, and why they behave the way they do, is not magic – it’s not ‘stuff happens’. Systems and complexity approaches are very disciplined ways of making sense of how things happen the way they do.
You spoke earlier about the systems and complexity field being large and full of many methods. If evaluators want to think systemically how do they choose which ones to use?
That’s an important question. If you rephrase it in terms of evaluative thinking, you can also see how difficult it is to answer. Yes, of course there are some great systems and complexity methods out there, and I know of some that could be particularly helpful to evaluators. But any method takes time to learn and apply well. So in the first instance I’d prefer to see evaluators start where they are now and use the methods they already know in more systemic ways. Once they’ve got the hang of that, they can gain the full benefit from learning specific systems methods – and they are likely to learn faster and make fewer mistakes.
So how do evaluators make their current methods more systemic?
Easy. Improve those methods’ approaches to understanding inter-relationships, engaging with multiple perspectives, and reflecting on boundary choices.
Perspectives from others
Universidad Nacional de San Juan & National Research Council of Science and Technology (CONICET), Argentina.
Bob's thoughts are both kind to and provocative for the field of evaluation – and for evaluators themselves. I say kind because his text reflects mature thinking about the practice of, and reflection on, evaluative thinking. And they are provocative because they uncover some of the less publicised facets of evaluators' work. For example, in introducing the three things that, for him, define systems thinking, Bob notes that evaluators use and talk about these key ideas a lot. In many cases, however, they seem to pay lip service to them without delving deeply into their content and consequences. Hence the admonition "Do your homework and avoid grabbing hold of simple clichés" is quite pertinent, warning us of the risk of turning systems thinking into a (new) buzzword – peppering speeches and reports with fashionable terms without sinking our teeth deep into the bone.
While the recommendation that evaluators start where they are and use the methods they know in a more systemic way is really sensible, it is particularly provocative when Bob calls it "easy" for evaluators to make their methods more systemic (see his reply at the end). Can every method be made more systemic by paying attention to the three dimensions mentioned? Can we imagine "systematic reviews" of systems thinking evaluations? How might evaluations in simple, complicated and complex environments benefit differently from systemic thinking?
Dear Bob, we need you to blog frequently ;-)
Sheila B Robinson
Grant Coordinator, Greece Central School District and Adjunct Professor, University of Rochester, Rochester, NY.
Bob makes excellent points here, and offers sound advice for evaluators. While I find systems thinking a fascinating area of study, I don’t think you need to be a systems expert to incorporate elements of systems thinking into your evaluation practice. That said, I think it behooves an evaluator to devote some time to learning the basics.
Understanding interrelationships and reflecting on boundary choices are two areas Bob emphasizes. Recognizing that all human service programs and policies are parts of open systems – with interrelationships probably more complex than we think, and boundaries probably less easily identified and defined than we think – helps me to challenge my own assumptions, push others’ thinking when we’re working together, and, as Bob suggests, focus on the arrows between the boxes. For me, it involves the recognition that there is always more to the story, and the humbling acceptance that I may never be privy to the whole story (nor may anyone else necessarily). I’m reminded of a quote from the late Donella Meadows, author of Thinking in Systems: A Primer (2008): “We know a tremendous amount about how the world works, but not nearly enough. Our knowledge is amazing; our ignorance even more so” (p. 87).
For me, thinking systemically means using evaluative thinking and all it entails (for a brief introduction to evaluative thinking, read Tom Archibald and Jane Buckley on Evaluative Thinking: The ‘Je Ne Sais Quoi’ of Evaluation Capacity Building and Evaluation Practice). It means valuing and questioning evidence, engaging in rich dialogue with colleagues (often through difficult conversations) about why we might be seeing what we are seeing in our data, and figuring out where in the system we may look for elements of the story we are trying to construct from an evaluation of a program. It means asking a question I learned from studying developmental evaluation with Michael Quinn Patton: instead of just asking “What works?”, ask “What works, for whom, and under what conditions?” You have to think systemically to be able to answer a question like that with any degree of veracity. As Bob says, it’s about “making sense of how things happen the way they do.” I think that all too often, we tend to stop short at “What happened?”
Bob urges us to engage multiple perspectives and while I think many of us certainly do attempt this in our evaluation work (especially if we’re using participatory, collaborative, or empowerment approaches), I can't help but wonder if we can do a better job of it thinking in more systemic ways and reflecting on our boundary choices. Who might we at first consider an “outsider” to the system who could potentially be affected by an evaluation? I worked on an evaluation recently of a program serving primarily high performing high school students who traditionally pursue higher education. I would never have guessed at the beginning of that evaluation that an outcome of our work would be linking with a program that serves students traditionally underrepresented in higher education. Engaging with people who had experience in both programs who could offer broader perspectives than those only associated with the program under review is what it took to put the two together - to think of the two programs as part of the same system and ultimately to expand our boundary choices.
I love this poetic advice from Donella Meadows (from Thinking in Systems: A Primer) that also captures Bob’s main points:
Guidelines for Living in a World of Systems
- Get the beat of the system.
- Expose your mental models to the light of day.
- Honor, respect, and distribute information.
- Use language with care and enrich it with systems concepts.
- Pay attention to what is important, not just what is quantifiable.
- Make feedback policies for feedback systems.
- Go for the good of the whole.
- Listen to the wisdom of the system.
- Locate responsibility within the system.
- Stay humble – stay a learner.
- Celebrate complexity.
- Expand time horizons.
- Defy the disciplines.
- Expand the boundaries of caring.
- Don’t erode the goal of goodness (2008, p. 194).
You might almost think this came from an evaluation textbook!
This links the three fundamental systems thinking elements (perspectives, relationships and boundaries) to various systems methods, through a set of evaluation-style questions. So if you are attracted to Question X, use systems approach Y.
This document was prepared for the 2008 innovation dialogue, Navigating Complexity, organised by Wageningen University, the Netherlands. It draws on the opening chapter of Williams, B. & Imam, I. (2007) Systems Concepts in Evaluation: An Expert Anthology. EdgePress/AEA, Point Reyes, CA.