C4D Hub: Simple, Complicated, Complex: meanings and implications

The Simple, Complicated, Complex typology

 

Simple, ‘known’  

Standardised – a single way to do it

Works pretty much the same everywhere / for everyone

Best practices can be recommended confidently

Knowledge transfer

Impact focus: did it work or is it still working?

Complicated, ‘knowable’ 

Adapted – need to do it differently in different settings

Works only in specific contexts that can be identified

Good practices in particular contexts

Knowledge translation

Impact focus: what worked for whom in what ways and in what contexts?

Complex, ‘unknowable’  

Adaptive – need to work it out as you go along

Dynamic and emergent

Patterns are only evident in retrospect

Ongoing knowledge generation

Impact focus: what is working in the current conditions? What is the best way forward at this point in time?

Identifying simple, complicated and complex/adaptive aspects and understanding the implications 

The tables below outline seven aspects of the initiative and its changes, and how these look different in simple, complicated and complex/adaptive situations. Below each aspect is an explanation of the implications.

Stakeholder perspectives and involvement

1. Focus

Question: Does everyone share the same objectives?

Simple: Everyone shares a single set of objectives.

Complicated: Different objectives valued by different stakeholders (competing objectives, different objectives at different levels).

Complex/Adaptive: Adaptive/responsive objectives.

Implications

Simple: Impacts to be included can be readily identified from the beginning. See: Develop initial description.

Complicated: Need to identify and gather evidence about multiple possible changes. See: Determine what 'success' looks like and work with stakeholders to understand similarities and differences. This can inform the process of Developing initial description. Later, there will need to be a process to Synthesise data and weigh benefits from a single study/evaluation to ensure different criteria are considered.

Complex/Adaptive: Need nimble impact R,M&E systems that can gather adequate evidence of emergent intermediate outcomes or impacts. See: Determine what 'success' looks like and work iteratively with stakeholders to understand similarities and differences. This can inform the process of Developing initial description of what is to be evaluated. Later, there will need to be a process to Synthesise data and weigh benefits from a single study/evaluation to ensure different criteria are considered.

2. Governance (management and decision making)

Question: Who has responsibility for management and decision making?

Simple: A single organisation.

Complicated: Multiple organisations (which can be identified) with specific, formalised responsibilities.

Complex/Adaptive: An adaptive/responsive list of organisations working together in flexible ways.

Implications

Simple: Primary intended users and uses are easier to identify and address. See: Understand and engage stakeholders.

Complicated: Likely need to negotiate access to data and ways to link and co-ordinate data. Might need to negotiate the parameters of a joint impact evaluation, including its scope and focus. See: Establish decision making processes and Decide who will conduct the research/evaluation (or other studies for monitoring), which will be especially important in joint or shared R,M&E. It will also be important to negotiate while Developing Planning Documents and Document management processes and agreements. Regular Review RM&E systems and studies processes will be important.

Complex/Adaptive: Need nimble impact R,M&E systems that can gather evidence about the contributions of emergent actors and respond to the different ways they value intended and unintended impacts. Iterative processes of Understanding and engaging stakeholders and of Establishing decision making processes will be important. Iterative processes of open communication will be required when Developing Planning Documents and Document management processes and agreements. Regular Review RM&E systems and studies processes will be important.

 

The C4D approach in the context of the ‘problem’

3. Consistency

Question: How much variability is there in the C4D approach(es) to be used in the initiative?

Simple: Standardised – a one-size-fits-all (C4D) programme.

Complicated: Adapted – variations of a programme planned in advance and matched to pre-identified contextual factors.

Complex/Adaptive: Adaptive – an evolving and personalised programme that responds to specific and changing needs.

Implications

Simple: Quality of implementation should be investigated in terms of compliance with 'best practice'. This can be noted in Specify the key R,M&E questions; to develop quality indicators, see Use measures, indicators or metrics.

Complicated: Quality of implementation should be investigated in terms of compliance with the practices prescribed for that type of situation. This can be noted in Specify the key R,M&E questions; to develop quality indicators, see Use measures, indicators or metrics.

Complex/Adaptive: Quality of implementation should be investigated in terms of how responsive and adaptive service delivery was. This can be noted in Specify the key R,M&E questions; to develop quality indicators, see Use measures, indicators or metrics. Key R,M&E tasks in being responsive include Develop recommendations and Support use.

4. Necessity

Question: How many different options are there for solving the problem? To what extent is this exact initiative needed to solve the problem?

Simple: There is only one way to achieve the intended impacts. Works the same for everyone.

Complicated: The (C4D) intervention is one of several ways of achieving the impacts, and the options can be identified.

Complex/Adaptive: Possibly one among several ways of achieving the intended impacts (uncertain).

Implications

Simple: Counterfactual reasoning is appropriate. See: Options for creating a counterfactual.

Complicated: Counterfactual reasoning is not appropriate, as it only accepts a causal relationship between the intervention and the impacts if they would not have occurred in the absence of the intervention. To understand causes and contributions, see Checking consistency of results and Ruling out possible alternative explanations.

Complex/Adaptive: Counterfactual reasoning is not appropriate, as it only accepts a causal relationship between the intervention and the impacts if they would not have occurred in the absence of the intervention. To understand causes and contributions, see Checking consistency of results and Ruling out possible alternative explanations.

 

5. Sufficiency

Question: To what extent will the problem be solved by the C4D initiative alone?

Simple: The (C4D) initiative is enough to produce the intended impacts. Works the same for everyone.

Complicated: Works only in specific contexts which can be identified (e.g. implementation environments, participant characteristics, and support from other interventions).

Complex/Adaptive: Works only in specific contexts which are not understood and/or not stable.

Implications

Simple: Counterfactual reasoning is appropriate. Reasonable to ask 'Does it work?' See: Options for creating a counterfactual.

Complicated: The impact evaluation question needs to be 'For whom, in what circumstances and how does it work?' Counterfactual reasoning is only appropriate if the causal package of supportive context and other activities can be identified and included. See: Options for creating a counterfactual, Checking consistency of results, and Ruling out possible alternative explanations. Explore all three strategies.

Complex/Adaptive: The impact evaluation question needs to be 'For whom, in what circumstances and how does it work?' Counterfactual reasoning is not appropriate, as the causal package of supportive context and other activities is changing and/or poorly understood and cannot be adequately identified. To understand causes and contributions, see Checking consistency of results and Ruling out possible alternative explanations.

 

 

Predictability of changes

6. Change trajectory (how impact variables will change over time – for example, a straight line of increase or a J curve)

Question: To what extent are the relationships between variables (e.g. exposure to messages and behaviour change) understandable and predictable?

Simple: Simple relationship (cause and effect). Predictable.

Complicated: Complicated relationship that needs expertise to understand and predict.

Complex/Adaptive: Emergent factors and multiple causes, with sudden changes (tipping points) that are unpredictable. Can only be understood in retrospect.

Implications

Simple: Measurement of change can be done at a convenient time and confidently extrapolated. See: Use measures, indicators or metrics.

Complicated: Measurement of changes should be timed when it will be most meaningful – expert advice will be needed. See: Decide who will conduct the evaluation; the timing of studies should be reflected in Develop Planning Documents (Evaluation Plans and M&E Frameworks).

Complex/Adaptive: Changes will need to be measured at multiple times, as the change trajectory cannot be predicted. The timing of studies should be as frequent and flexible as resources allow; see Determine and secure resources. Decisions about timing should be reflected in Develop Planning Documents (Evaluation Plans and M&E Frameworks). Analyse data is an important task, and the skills required should inform decisions about who will conduct the evaluation.

7. Unintended impacts

Question: To what extent are unintended impacts predictable?

Simple: Easily predictable and solvable.

Complicated: Likely only in particular situations; expertise is needed to predict and address them.

Complex/Adaptive: Unpredictable; only identified and addressed when they occur.

Implications

Simple: Need to draw on previous research and common sense to identify potential unintended impacts, which can be done upfront, and to gather data about them.

Complicated: Need advice from experts about potential unintended impacts and how these might be identified. See: Decide who will conduct the evaluation.

Complex/Adaptive: Need to cast a wide net of data collection that will catch evidence of unexpected and unanticipated unintended impacts. The process of identifying potential unintended impacts should happen iteratively during implementation. Measures, indicators or metrics might be useful, but will probably need to be complemented with other methods to Collect and/or retrieve data. Key tasks include Develop recommendations and Support use, in order to avoid doing harm.
