C4D: Realistic
Pragmatic; mixed-methods; grounded; flexible
To be most effective, R,M&E approaches and methods need to be grounded in local realities. This requires openness, freedom, flexibility and realism in planning and implementing R,M&E and in the selection of approaches, methodologies and methods. This approach aims to increase the usefulness of evaluation results, which should focus on intended, unintended, expected, unexpected, negative and positive change. Long-term engagement with organizations and communities ensures effectiveness and sustainability, and a long-term perspective on both evaluation and social change.
Where do we start?
In order to make decisions about what is feasible and practical, it is important to understand what resources are available and seek additional resources where required. A good place to start is the Determine and secure resources task.
Incorporating and implementing critical approaches in practice
Manage (and commission) an evaluation or evaluation system
Determine and secure resources:
Securing the resources needed, particularly funding, for R,M&E of C4D is a common challenge. This is a foundational task for being realistic in the approach to R,M&E of C4D.
Define ethical and quality standards for R,M&E:
In C4D, ethical standards should include sharing results and findings in accessible ways (especially with marginalised groups and those who were consulted during data collection and report writing) as an ethical responsibility. This also helps to promote a learning-based culture and continuous learning.
Document management processes and agreements:
Pay attention to the description of the Scope of Work and make sure it matches the funding available. Experienced consultants can spot (and will avoid) Terms of Reference that ask for too much within too little time and without adequate resources. Use the Determine and secure resources task to make sure the resources available match the scope, and consider cheaper options.
Not all capacity building work should start from scratch. What existing systems and ‘communities of practice’ can be used to enhance capacities and strengthen networks? Prior to implementing capacity building, ensure a capacity needs assessment (which could be rapid) has been undertaken.
Define
Develop an initial description:
This process can be useful for defining the boundaries (geographical and timeframe) of the initiative and R,M&E. It is important to be realistic about what kinds of outcomes or impacts can be expected within certain timeframes.
Frame
Specify the key R,M&E questions:
In C4D, the questions should be written in a way that calls for a variety of methods and tools that will capture people's voices.
Describe (to answer descriptive questions)
Choices about methods must remain practical, pragmatic and feasible, and fit with the available resources. This may involve compromise to remain realistic; however, in C4D, ensuring that local needs, voices and experiences are given prominence should remain a priority.
Combine qualitative and quantitative data:
As part of being realistic, the C4D Evaluation Framework advocates for the use of mixed methods. However, this doesn't mean that every R,M&E activity must include both qualitative and quantitative data. For example, a qualitative study might be needed to fill gaps in quantitative data or indicators.
Additional resources may be required for analysing qualitative data (word-based data, i.e. spoken or written words from stories, interviews, questionnaires, focus group discussions, videos, etc.). In C4D, qualitative data is often critical to understanding contexts and changes. Qualitative data analysis (summarising and looking for patterns and themes) can be more time-consuming than quantitative analysis, and requires different sets of skills.
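As a rough illustration of why this work takes time and different skills, the sketch below (a hypothetical example, not part of the C4D Evaluation Framework; the theme keywords and transcripts are invented) does a crude first pass at spotting candidate themes in interview transcripts by counting analyst-defined keywords. Real thematic analysis involves iterative, interpretive coding that cannot be reduced to keyword counts.

```python
# Illustrative sketch only: a crude first pass at spotting candidate themes
# in interview transcripts by counting analyst-defined keywords.
# Real qualitative analysis requires iterative, interpretive coding.
from collections import Counter

# Hypothetical theme keywords chosen by the analyst
THEMES = {
    "access": ["radio", "phone", "signal"],
    "trust": ["trust", "believe", "reliable"],
    "participation": ["meeting", "discussion", "voice"],
}

def tally_themes(transcripts):
    """Count keyword hits per theme across a list of transcript strings."""
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            counts[theme] += sum(lowered.count(k) for k in keywords)
    return counts

if __name__ == "__main__":
    sample = [
        "We trust the radio because the discussion includes our voice.",
        "The phone signal is weak, so we rely on the radio.",
    ]
    for theme, n in tally_themes(sample).most_common():
        print(f"{theme}: {n}")
```

Even a simple tally like this needs follow-up interpretation of the underlying quotes, which is where most of the analyst's time goes.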
Understand causes (to answer questions about causes and contributions)
Investigate Causal Attribution and Contribution:
Feasibility and availability of expertise might be factors when deciding on methods for investigating causes.
Experimental and quasi-experimental designs (strategy 1) don’t necessarily take more time and resources, but they do depend on a number of practical factors, including upfront investment in planning and design, and the ability to plan the C4D intervention around the needs of the experiment.
Where these things are not possible, it might be more pragmatic to use:
- Strategy 2: Check that the results support causal attribution; and
- Strategy 3: Investigate possible alternative explanations.
Synthesise
Synthesise data across evaluations:
Lower cost options, such as rapid evidence assessment, are useful where there is a need to realistically balance the available resources and the need for quality data and rigour.
Report and support use
While there are many great options that may be ideal for communicating with different groups, it is also important to be realistic about how many different options are feasible. There may need to be trade-offs in relation to how many different media are used, the quality of production and other factors.
Challenges and strategies
In an ideal world there would be enough resources to do a perfect evaluation. In the real world, small budgets and a lack of time mean that compromises might be necessary. How do you decide where to compromise? And how do you maintain integrity and usefulness?
It is important first to be clear about the resources that are available, to think broadly about resources (including staff time, knowledge, existing sources of data etc.), and to consider how to seek additional resources. Advice of this nature is outlined in Determine and secure resources. Thinking about the match between the design and the available resources is often something we have to return to. It involves thinking creatively to make the best use of resources. Are there adaptations that can be made to make data collection methods more 'rapid' and small scale? See Collect and/or retrieve data. Are there ways to value and synthesise tacit knowledge of stakeholders to achieve the goals? See Synthesise data across evaluations.
Indicators for C4D pose challenges in terms of feasibility and practicality. Existing data sets usually don't cover C4D dimensions; however, commissioning data collection for indicators is often not feasible either.
Remember that indicators are signs or signals of progress. Although indicator data often take the form of population or household data, perhaps there are other things that might be 'good enough' signals of progress? Or are there proxy indicators that could be used (with a clear understanding from collaborators about the limitations)? Read more on the Use measures, indicators or metrics page.
Data collection systems using technology can be a realistic solution. Although they require some upfront investment, in the long term they can make household surveys much more feasible. See the T-Watoto case example.
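As a hedged sketch (the field names and records below are hypothetical and not drawn from the T-Watoto example), the snippet shows how digitally collected household survey records might be aggregated into a simple proxy indicator, such as the share of households reporting that they heard a campaign message.

```python
# Illustrative sketch: aggregating digitally collected household survey
# records into a simple proxy indicator. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class HouseholdRecord:
    household_id: str
    heard_campaign_message: bool  # proxy signal of campaign reach

def campaign_reach(records):
    """Return the share of surveyed households reporting exposure."""
    if not records:
        return 0.0
    exposed = sum(1 for r in records if r.heard_campaign_message)
    return exposed / len(records)

if __name__ == "__main__":
    records = [
        HouseholdRecord("HH-001", True),
        HouseholdRecord("HH-002", False),
        HouseholdRecord("HH-003", True),
    ]
    print(f"Proxy reach indicator: {campaign_reach(records):.0%}")
```

The value of a system like this is less in the arithmetic than in having records flow in routinely, so that the proxy can be tracked over time without commissioning a new survey each period.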
Case example
In Vietnam, the assessment of the VAC Campaign had a relatively small budget of approximately US$10,000. The original plan for the assessment had to be scaled back to be feasible within this scope. However, with careful planning and strategic selection of samples (two field sites), methods and tools, and data sources, a useful report that met the needs of the key users was achieved.
Barefoot M&E
The Barefoot Impact Evaluation methodology was developed in the context of a UNESCO/UNDP Media Project in Mozambique (see the Communication Initiative website) as a cost-effective, simple and practical R,M&E methodology to be designed and implemented by community radio, with little or no external support. It uses a range of local tools and solutions to build R,M&E plans around the opportunities that are available. It was designed to be just enough to 'check the pulse' of the radio station, but not too burdensome. The techniques used have wide applicability and could be adapted to suit a range of different C4D NGOs and other contexts. Some of the realistic, barefoot techniques include:
- an internal self-assessment 'check-up' using a checklist
- 'hearing out' the community, where informal interviews with community members on their satisfaction are added onto routine contact with communities
- registration of callers and letters to the station, with forms left by the phones so that demographic information of callers can be recorded
- feedback questions on the back of message slips (message slips are primarily used to request that announcements be made, but 30% of people also filled in the questionnaire on the back)
- interviews with people living in the staff members' neighbourhood, which enable some spread of the sample
- interviewing at public events
- some M&E is undertaken by a 'community mobilizer', who is a paid staff member at the station and is trained to undertake more in-depth focus group discussions and interviews.
This exemplar is consistent with the C4D Evaluation Framework in the following ways:
- Realistic: The low-cost 'barefoot' approach focuses on making the most of limited resources. Although it does not meet academic standards in terms of sampling and rigour, it is good enough for the context in which it is to be used.
- Participatory: The approach is intended to be managed and implemented by community radio stations with a nominated community mobilizer.
- Learning-based: The key users of the assessments are the community radio stations themselves. If they use it for learning and improving, the M&E is meeting its purpose.