From paper to practice: Supporting the uptake of high-level M&E frameworks


Evaluation frameworks are often developed to provide a common reference point for evaluations of different projects that form a program, or different types of evaluations of a single program. 

But getting agreement on a shared document is only the start of achieving the intended benefits of evaluation frameworks, such as reduced duplication and overlap, improved data quality, and ease of aggregation and synthesis. This guest blog by George Argyrous (ANZSOG) outlines 9 actions that can be taken to support the implementation of high-level monitoring and evaluation frameworks, and make sure these frameworks don't languish on a dusty shelf.

I have recently been involved in the development of a high-level national monitoring and evaluation framework for disaster recovery, intended to be used at state and territory level and for individual evaluations. From this work I have developed a list of recommendations for how to embed such high-level frameworks, once they are written. I call this list ‘from paper to practice’, the premise being that even the best-written framework is useless unless someone is actually using it!

The list was generated through a series of five structured workshops with key people in this policy domain who are responsible for implementing the framework. The first workshop began with an open question about how to embed the framework in practice. From this, an initial list of suggestions was developed and then expanded upon in an open group discussion. This list was then used to prompt discussion in subsequent workshops, where it was further refined and its feasibility assessed.

1. Ensure senior management understands the principles embodied in the framework and communicates to all staff the need to follow them.

Briefing senior managers on the substance of the framework can be more effective than expecting them to read it in its entirety. Similarly, providing them with briefing notes they can use to communicate the importance of the framework to staff can help them drive the message home.

2. Reference the framework in relevant material and documentation.

This includes ‘planning’ documents that exist at various levels of government, such as handbooks and operations manuals. The framework should also be referenced in staff ‘induction’ processes and documents, on intranets, and in tender documents involving evaluation work.

3. Communicate the role and principles of the framework to those working in other related policy fields.

An evaluation framework may be developed within a particular policy domain, such as education, to evaluate student learning outcomes. However, these outcomes may also be regularly addressed in overlapping policy domains, such as youth services. A framework developed ostensibly for evaluating the education system’s progress toward student learning outcomes might therefore also be relevant to evaluating youth services programs that share learning outcomes among their objectives.

4. Communicate the role and principles of the framework to NGOs and other external organisations involved in program delivery.

Program delivery is often done by governments in partnership with other organisations such as NGOs. The role of the framework should be made clear to these bodies, which have an ongoing relationship with an agency, especially in terms of the major outcomes the framework identifies. In disaster recovery, for example, the Red Cross implements its own recovery programs as well as providing services for government programs. Briefing the Red Cross on how the framework can support these activities will increase the scope of its use.

5. Coordinate with other relevant government agencies that have ‘overlapping’ program objectives.

A framework developed around a broad policy area, or at a broad organisational level such as a whole agency, may share major objectives with other broad policy areas or agencies. For example, disaster recovery can overlap with a national strategy for asbestos removal. Respective frameworks can cross-reference each other explicitly in the documents themselves, or staff can simply be made aware of the overlap.

6. Engage with relevant public sector evaluation units.

Many government agencies have evaluation teams with high levels of expertise. These teams cover a range of agency-wide issues that can be broader than those addressed in a particular evaluation framework developed by a branch within an agency, or for a policy domain that cuts across agencies. To draw effectively on this expertise, the content of the framework, and how these teams can support its implementation, should be discussed with them proactively. In disaster recovery, for example, this involves a number of specific agencies, including human services, education, and transport, each of which usually has its own internal specialist evaluation unit.

7. Build targeted evaluation capability among the various groups that might be involved in implementing the framework.

A framework should not assume that all relevant people have the skills and knowledge needed to implement it. Moreover, the skills and knowledge needed will vary across types of users. For example, program design staff will need a working knowledge of how to design program logics and how to commission evaluations that satisfy the requirements of the framework. This is not the same set of capabilities expected of the people who will undertake the evaluations. Targeted support, through activities such as workshops, mentoring, and case studies, can help build the capability needed to implement the framework. In disaster recovery, this has involved a series of ‘implementation workshops’ structured around hypothetical disaster recovery scenarios to which the framework was applied. These workshops were also used to build cross-agency and cross-jurisdictional networks of practitioners interested in disaster recovery program evaluations; ‘teams’ were formed to work on the scenarios to maximise this network-building objective.

8. Ensure funding for evaluative activity is built into recovery planning and implementation budgets ‘up front’.

Evaluative work, including the evaluations themselves, needs to be resourced. A budget should be set aside in advance to support this work, and its use should be linked to the framework. For example, disaster recovery funding from the Commonwealth to the states explicitly references the national framework and allocates specific funding for program evaluation.

9. Add this list to the evaluation framework, so that embedding is explicitly addressed within the framework itself.

A high-level evaluation framework should explicitly discuss, within its own structure, how it is to be embedded. Indeed, this list could serve as a starting point for that discussion.

Share your thoughts

We'd love to know if you have any other ideas for embedding high-level frameworks, or any feedback on the above suggestions!