Last week, we started our focus on Adaptive Management with a blog post by Patricia Rogers that explored how monitoring and evaluation can support adaptive management. This week, we're continuing this series with a guest blog from Fred Carden and Arnaldo Pellini, in which they discuss what they learned about adaptive management in a major project on developing capacity for evidence-based policy.
One of our objectives for this Adaptive Management series is to revise the Decide Purpose task page in BetterEvaluation's Rainbow Framework, and perhaps add a new option of 'Support adaptive management' - and we're looking to learn from your experience on this. If you'd like to be part of this process to co-create and share knowledge, please click the link at the end of the blog to connect with us. And of course, we welcome comments directly on the blog page too - we've posed a number of questions at the end of the blog and would love to hear your thoughts on these.
The Australia-Indonesia Partnership for Pro-Poor Policy: The Knowledge Sector Initiative (KSI) is a joint programme between the governments of Indonesia and Australia that seeks to improve the lives of the Indonesian people through better quality evidence-informed policy making. The programme is working with Bappenas (Indonesian Ministry of National Development Planning) to assist research institutions to improve the quality and relevance of their research; improve the communication of research results to inform public debate and policy making processes; and identify and mitigate systemic barriers that limit interaction between knowledge production, intermediation, demand and use. The first phase of the programme started in May 2013 and will end in June 2017.
KSI is built on the hypothesis that greater use of research evidence improves public policy, and it does not have ready-made solutions to the problems in the evidence-based policy making space in Indonesia. It is a programme that has to find solutions that work through trial, error and learning. In the following conversation between Fred Carden and myself, we discuss what learning means in an adaptive programme such as KSI.
Arnaldo Pellini I have been thinking lately about the meaning of Learning in a programme like KSI which aims to be problem-driven and adaptive. Recently Duncan Green wrote that ‘adaptive management, seems to be where the big aid agencies [and large programmes?] have found a point of entry into the whole ‘Doing Development Differently’ debate.’ There is a lot of talk around Learning in this attempt to find a different way to address ‘wicked hard problems’.
A couple of weeks ago I stumbled upon a quote by Denis Diderot: ‘There are three principal means of acquiring knowledge... observation of nature, reflection, and experimentation. Observation collects facts; reflection combines them; experimentation verifies the result of that combination’. It made me think about the implementation of KSI and the process of transforming raw data from observations into knowledge and learning to be used by others. I also thought about the many meanings and interpretations that are given to Learning by all the individuals and stakeholders who gravitate around the KSI programme. I came to the conclusion that establishing and developing a learning function that satisfies everybody is almost impossible because the demands, needs and interests are very different. What do you think?
Fred Carden Interesting question. Let’s start from what KSI’s learning approach means. To look at that I will contrast it with the approach that was (appropriately) used in the design phase for KSI (back in 2010-2013), and which many assumed would continue to be central to how KSI operated. That did not turn out to be the case.
KSI was designed largely through diagnostics conducted to get a better handle on the issues and problems at play and to help identify some key starting points and key actors in the knowledge sector. It did that well.
If we go back to our dictionaries, diagnosis is about identifying a problem, be it a disease, or an institutional or legal barrier. As such it is positivist; it is about getting clear on what is happening. Diagnosis is not treatment. Diagnosis is not about doing something to address what has been identified or clarified; it is merely our best guess at what the problem actually is. This is not to dismiss or downgrade diagnosis, but simply to situate what it does and does not do.
Doing a diagnosis is useful because as we learn our diagnosis changes. But we don’t learn by doing diagnosis. A diagnostic approach can identify some of the key problems and opportunities through a number of tests or studies. At some point, an agreement is reached that sufficient diagnosis has been carried out to identify the problem(s).
AP Interesting metaphor. So, the diagnosis is not an end in itself. It is a means to an end, that is, to generate and accumulate evidence to better understand and unpack the problems a programme will face. It kick-starts the process of thinking of possible solutions. You are also saying that diagnosis is more than an assessment done at the beginning of a programme; it is iterative.
FC Yes. In the case of the knowledge sector, the diagnostics mapped out a response to the diagnosis with the design of a fifteen-year programme in three phases, that is KSI. Part of the diagnosis was that the problem in the Indonesian knowledge sector would take a long time to treat because of the multiple manifestations of the problem in different sectors and at different levels. The diagnosis did not suggest taking two aspirins and calling again in the morning. It said you would be at this a long time and suggested there would be many twists and turns along the way. It also suggested that diagnostic studies would continue throughout implementation of the programme. The diagnostics and experience also suggested that there was a need for much stronger ownership of the process in local institutions and organizations. So to move forward, KSI moves beyond diagnostics to the “doing” – and that is where the learning happens.
AP I can see a risk here, especially if researchers are involved in a programme, and that is that diagnostics are never enough. If they are iterative, they almost certainly raise new questions that can be studied further. In a sense, the diagnostic and learning that comes from it may never end.
FC I agree, the risk is there, but remember learning and better understanding does not come from diagnostic studies. Learning comes from doing:
Doing is the precursor to learning and learning is the precursor to developing a robust vision for the work to be done going forward.
We had excellent diagnosis to start with, but we knew that a diagnosis could only peel back so many layers of the complex problems of the knowledge sector; these are many, such as poor quality of research, low capability to demand evidence in government organisations, insufficient incentives for academics to publish in international journals, etc. Learning comes also from testing things out, learning from what happens, revising and testing again. If we go back to our dictionary, learning is the activity or process of gaining knowledge and skills by study or practice - put another way, modification of a behavioral tendency by experience (as opposed to conditioning). As we learn we move into new territory, so some new diagnostics are needed, but always with an eye on how they help us move towards our objective.
AP Problems or solutions? Where to start and what does it mean for learning? I find these questions interesting on multiple levels: adaptive development, alternative development processes, monitoring and evaluation, knowledge management, etc. I recognize that I am biased and believe that it is important to start from problems first, with an open mind about what solutions can be. Starting from solutions (i.e., expert solutions) has been one of the main problems that development has had in the last 40+ years, particularly in building state capability. But what you are saying is that it is also possible to start from solutions with an open mind. All of which is also conducive to learning.
Box 1: KSI saw the Indonesian Academy of Sciences (AIPI) as a foundational organization in the country’s knowledge sector. The programme set about building a relationship with AIPI and exploring potential interaction. We did not know where that would lead when we started, but we knew that a well-functioning Academy was part of the long-term solution because of its potential to provide policy advice based on the science. The initial meeting led us to identify two key problems: the lack of funding for science and the limited support to young scholars to be active in research. Testing solutions to these problems with AIPI has led to the establishment of the Indonesian Science Fund and ALMI (or Young Academy of Science). The ongoing work with AIPI focuses on strengthening its basic operating systems and strengthening AIPI’s institutional relationships within Indonesia and internationally.
FC A learning approach starts from the position that we have some idea of what we think the problem is (diagnosis is problem-driven), and some ideas about what to do about it, some possible solutions. When I say ‘we’, I do not mean one group (such as a programme team) with one view and one set of ideas, but a number of different stakeholders who agree on the general parameters of the problem they want to solve and may want to test different ideas and different approaches. The path forward in a learning approach is to take some of these potential solutions and test them, rather than focus in the first instance on a redefinition of the problem. So I say solution-driven because the whole point is to experiment and find possible solutions. But these need to be based on evidence of what the problem seems to be. Too often research is happy with its diagnosis, but to me, the issue is how we use that diagnosis for change. An example is presented in Box 1.
It also means being open to the potential solutions of others and where it makes sense, integrating and adapting others’ ideas and approaches. As we test, we refine the problem and refine the solutions. Through testing and interaction with the problem, elements of it become clearer and different elements take on more importance than others, sometimes because they are more tractable and therefore help create momentum or a vision for the work to be done going forward, sometimes because we did not see them until we started the work. A learning approach is iterative, going back and forth between problem and potential solution. As we test things out we also redefine and refine our understanding of the problem. But the key point is that a learning approach doesn’t stay focused on only the problem, it goes to solutions.
Documenting our learning is the major activity in which KSI is engaged at the moment. The evaluation products such as Stories of Change and their synthesis, the KSI Performance Story, the organizational assessments of policy research institutes, the documentation of practices that have worked elsewhere all contribute to the learning that is informing directions for the next phase of the programme.
AP In a sense it is not so much a dichotomy between problems and solutions, rather the acknowledgement that a learning-based approach is the common denominator between problem analysis and solution testing.
FC A learning-based approach does not negate the need for periodic diagnostics. But the learning really happens when we try something out and we see what works, why and for whom. We can use that learning to adjust, carry out new diagnostics or sometimes continue on the same path. It intends to be more partner-driven and is focussed on identifying the mechanisms that will strengthen the knowledge sector over the long term. It involves trial and error, which brings us to the point that learning has opportunity costs. We have to recognize these, but I would argue that the resources spent on learning help ensure we waste much less of our precious programme resources. Some of the working papers that KSI has presented are diagnostic in nature, such as the partner review of university barriers led by Yanuar Nugroho. Some working papers present solutions that have been tested and tried out elsewhere and which could generate ideas for possible solutions in the context and with the partners with whom KSI works (e.g., Managing a Government Think Tank: Inside the Black Box; Investing in Evidence: Lessons from the UK Department for Environment, Food and Rural Affairs).
AP To conclude, it is not so much about whether to start from problems or solutions, in terms of acquiring learning that informs plans and actions. Both are good options.
FC Yes, that’s right. We need some thoughtful, locally-driven problem definition, but we also need to make some forays into possible solutions. I would add that what matters is also who speaks of the problems and who is suggesting the possible solutions. The KSI programme tries to give ownership and control of the analysis of problems and the identification of solutions to partners through core grants and other mechanisms.
Complex interventions are, well… complex. There are no easy solutions, and even understanding the problems can be a challenge. As someone once defined complexity – ‘You don’t solve complex problems, but you know you are making progress if you like your new problems better than your old ones.’ The essence of a learning approach is continuous adaptation and evolution of problem and solution.
Let’s continue the conversation
- How relevant are these ideas for your work?
- How different are they from what you already do?
- What are some challenges in doing evaluation in ways that support adaptive management? How can they be overcome?
- Are there good examples of evaluation for adaptive management we can learn from? Or guidance?
We’d love to hear your thoughts on the questions posed above in the comments below. And if you'd like to be involved in the discussion further and help with the development of an Adaptive Management Option page, please register your interest and let us know what examples, advice, resources or questions you'd like to share.