52 weeks of BetterEvaluation: Week 46: An ethnography of evaluation - learning about evaluation from the inside using video
Conveying the complexities of the evaluation process isn’t easy, but video is one way to tackle the challenge. Quimera is a film company that was contracted to record the process of evaluating the USAID Growth with Equity in Mindanao III (GEM 3) project in the Philippines. In the second blog of our series on video in evaluation, Paul Barese from Quimera offers a few thoughts on using video as an evaluation learning tool.
Read the first blog in this mini-series on video, Week 45 of BetterEvaluation: Participatory Video for M&E. Next week, Glenn O'Neil, coauthor of the Intelligent Measurement blog, describes using video to communicate evaluation findings.
We were invited by the evaluation firm, Social Impact, to join the evaluation team and spend three weeks exploring and documenting one of the first project evaluations conducted under USAID’s new evaluation policy (read Mathia Kjaer's blog about the framing of this evaluation). The video will be used as part of the curriculum in USAID and US Department of State evaluation training courses. It aims to convey some of the challenges evaluators encounter in the field (particularly in conflict-affected areas), to illustrate different types of interviews, interview techniques, and interview dynamics in action, and to help people at USAID and beyond better understand the complexity and value of evaluation.
The power of video for this purpose was evident. For example, the video shows quite clearly how interview dynamics shift during the course of a meeting, how interviewees react and respond to Filipino members of the evaluation team compared with non-Filipino members, how the language of the interviews moves between English and Filipino, and how team members manage these dynamics “on the fly” to collect the best data possible in the circumstances. The video also lets viewers observe the team conducting the evaluation in a conflict-affected area: managing logistics, coping with frequent changes in scheduling due to security risks, and conducting interviews with security escorts present, sometimes in uniform, sometimes in plain clothes.
At the end of each day I interviewed evaluation team members, which provided an opportunity to capture reflections on the day’s events and processes, challenges with data collection, and how members planned to move forward and address issues and concerns. Watching this process of activity and reflection in real time is engaging, powerful, and extremely instructive.
There were two important factors that contributed to the success of this project. The first relates to a common concern that arises when discussing evaluation (and research) and video: the impact a video camera will have on the subjects of data collection and, ultimately, on the quality of the data. In this case, after some initial apprehension, the team found that the cameraman was able to remain unobtrusive and avoid attracting too much attention from the people being interviewed. It may have helped that interviewees and focus group participants had their own still cameras, video cameras, camera phones, and tablets and were themselves capturing still and moving pictures, indicating familiarity and comfort with digital media. We also made it clear that they need only ask for something to be ‘off the record’ and we would switch the cameras off.
The second factor was that the video crew had a good understanding of project evaluation and development, as well as of video concept development, production, and editing. This was important because we needed to recognise relevant issues as they arose during the course of the evaluation team’s work (interviews, focus groups, community focus groups, logistics, project management) and to know which themes and topics to explore with team members during the interviews. The evaluation team did not “script” its story; rather, part of our responsibility was to help the team explore, discuss, and capture the story in real time as the evaluation unfolded.
Although this video focused on the evaluation process, with the evaluators and the commissioners of the evaluation as its main characters, the parameters could have been changed or expanded to create a video focused on evaluation findings and recommendations, or one focused on knowledge management, best practices, and lessons learned from the project. In other words, a number of other approaches could have been taken to combining evaluation and video, with video products linking to communications and public relations. This is an important point to keep in mind: when it comes to video from the field, a number of constituencies within an organisation may be interested in some or all of the footage. Finding ways to collaborate across organisational functions (M&E, knowledge management, communications) not only helps to facilitate buy-in and collaboration between departments but also shares costs across departments and maximises production efficiencies.
Watch two short clips from the final video. The full video is available here.
Additional resources
Develop evaluation capacity
Evaluation capacity includes developing an evaluation culture of valuing evidence, valuing questioning, and valuing evaluative thinking. Some people also refer to evaluation capability - the ability to actually use capacity.
Read more.
Building an evaluative culture for effective evaluation and results management
This brief from the Institutional Learning and Change Initiative (ILAC) provides advice on building an evaluative culture that allows organisations to better manage and deliver programs and services.
Read more.
Photo: Norma Capuyan, vice chair of Apo Sandawa Lumadnong Panaghiusa sa Cotabato (ASLPC) - Keith Bacongco/Flickr.