Willy Pradel and Gordon Prain from the International Potato Centre in Lima, Peru and Donald Cole from the University of Toronto discuss the evaluation they recently conducted which applied a mixed-methods approach to capture and understand a wide variety of changes to organic markets in the Central Andes region. This case demonstrates a good rationale for choosing a mixed-method design and also an authentic implementation that effectively mixes quantitative and qualitative data to enhance the value of each.
Mixed-methods design was defined by the project team in a similar way to how it was described by Jennifer Greene: as the systematic integration of quantitative and qualitative research methodologies at all stages of the impact evaluation, in order to strengthen the reliability of the data, validate findings, and deepen our understanding of the processes through which project outcomes and impacts are achieved, and of how these are affected by the context within which the program is implemented.
Why did you choose this kind of design in this case?
The Hortisana project aimed to promote organic markets among smallholder horticultural producers. As well as providing training for producers, the project sought to organise groups of producers to support marketing of healthy agricultural products. The key evaluation question was: Did Hortisana interventions contribute to changed attitudes and/or practices of the participants?
The design of the evaluation was strongly affected by the context in which the project was implemented. The project spanned two regional sites, the Andes of Ecuador and of Peru, which differ in agro-ecological conditions and market access but share one commonality: neither place had many healthy markets, even though some NGOs had worked on several organic production initiatives. The number of producers interested in entering an organic market, as well as in learning new ways of planting vegetables and thinking about the consumption of healthy food, was expected to be low.
Therefore, with a low number of beneficiaries and the need for a process to get producers involved in a slowly growing market, we needed methodologies to map and capture changes, examine the factors behind those changes, characterise the different types of producers, and enrich our understanding of the changes by triangulating the data.
Which methods did you use and how did you practically mix them in this case?
The methods we chose to fit this need were: theory of change, Most Significant Change (MSC), Q methodology, and a quantitative survey. These methods were used in a mixture of sequential and parallel data gathering; see the figure below for a representation of this.
The theory of change was the first methodology applied, in the early stages of the project, and it allowed stakeholders and beneficiaries to contribute to a broad definition of project impact. This was followed by an MSC process near the end of the project, which involved the collection and rating of short ‘significant change’ stories from project participants. After analysing the results and identifying the areas of change most frequently reported by participants, the other two methodologies were applied in parallel: the Q methodology, to assess the perceptions of beneficiaries in comparison with a referent group using words and phrases from the MSC process; and the quantitative survey, to capture behavioural changes in the categories revealed through the MSC process.
Graphical representation of the sequence and combination of methods used in the Hortisana evaluation.
Did the methodology throw up any surprises throughout the evaluation?
The methodology brought surprises right from the beginning, when the theory of change exercise revealed that the local project administrators did not share a common understanding of the project objectives with the regional team. The Most Significant Change approach suffered a bias when the selection of stories was made mainly by producers rather than by the expected balance of producers and institutional actors. We found, for instance, that stories with emotional content were selected more often than others, and the bias this introduced meant that we had to disregard the selection of stories as an output. Nevertheless, the stories still provided useful information for the Q methodology and the quantitative survey. Also, when analysing the results from the Q methodology, we found that there was an important group of environmentally conscious producers, which we weren’t expecting and which indicated great potential for scaling up the project if more actions to improve organic markets were taken.
What advice/top tips can you give someone who is considering a mixed-methods evaluation design?
You need a broad understanding within your team of a range of research methods in order to choose the right mix and fit the approach to your needs, as well as the technical skills to apply the methods and adapt them to local contexts. Be prepared to look for creative ways of using information when one method does not yield all the information you expected to collect. Be open to innovation and read about different methods, for example:
- For mixed-method approach: Integrating Quantitative and Qualitative Research in Development Projects
- For theory of change: The Community Builder’s Approach to Theory of Change: A practical guide to theory development
- For Most Significant Change: The ‘Most Significant Change’ (MSC) Technique: A Guide to Its Use
- For Q methodology: A primer on Q methodology.
The first post in this series was Mixed methods in evaluation part 1: a warm up. The third was: Mixed methods in evaluation Part 3: Enough pick and mix; time for some standards on mixing methods in impact evaluation.
[Image source: MacJewell/Flickr]