A new pathway: how can funders support meaningful monitoring, evaluation, and learning practice in the field?

By
Andrea Azevedo and Megan Colnar

How can donors and grantees work together to create effective monitoring, evaluation, and learning (MEL) practices that drive field-wide transformation?

The Open Society Foundations’ Fiscal Governance Program found success by focusing on six key approaches, including empowering grantees and relinquishing power. In 2021, an external close-out evaluation by Intention to Impact of the program (which ran for seven years and gave over $150 million in grants) revealed something pretty remarkable—the program’s deliberate focus on strengthening field-wide monitoring, evaluation, and learning practices was a success. Substantial capacity increases were observed across key institutions and grantees, new complexity-sensitive practices and methods were being actively championed and deployed, and a growing community of better-connected practitioners was exchanging tips and tricks on how to apply smart, context-specific MEL across fiscal governance issues. What’s more, in this evaluation, most grantees gave high praise to these efforts.

So, how did this come about? We detail the six different approaches we used in our new publication Setting new standards for better MEL: Lessons for funders and grantees. The approaches range from checking power dynamics to growing skills for evaluative thinking and seeding peer learning and field-wide research. The publication is paired with a toolkit and showcases resources we used and iterated on across the various approaches.

Above all, we hope to make it clear that this transformation was sought out deliberately, and was underpinned by our own experiences as grantees desperately trying to follow impenetrable, unchanging donor-mandated MEL in previous roles. Making the shift from ‘grantee’ to ‘funder’ can be disorienting, perhaps in few roles more so than in monitoring, evaluation, and learning. Far from being two sides of the same coin, the distinctive power dynamics in the grantee-funder relationship mean that MEL practices for both parties are overwhelmingly dictated by the funder and have traditionally had major implications for the grantee’s continued funding.

We saw a huge gap in field-wide MEL know-how and application across the fiscal governance field, but we also recognised how delicate and complicated donor interventions about monitoring, evaluation, and learning can be. We committed to narrowing the gap, and hoped that it was possible for a donor to spur field transformation by partnering with organisations and their internal MEL staff for change. After six years of this work, several rounds of grantee and external actor feedback, and an external evaluation, it seems our optimism was not misplaced.
Through years of values-driven practice—and a considerable amount of trial and error—we were able to demonstrate that funders and grantees can have productive, mutually respectful, and effective conversations about MEL. In our experience, having the resources to give a wide range of support (from grants to field innovations to peer learning) via a dedicated grantmaking budget was a game-changer. And, importantly, our program showed that donors who champion and advocate for flexible, trust-based MEL practice with and within their partner organisations can create positive ripple effects well beyond an individual grantee.

Important elements for success

You can read more about what we tried in detail in our Learning Brief. In this blog, we wanted to share some of the key ingredients that were crucial for success across all six of the approaches we deployed:

  1. A technically skilled internal team who understood the unique challenges complex systems change requires.
    It almost seems silly to emphasise this point, but in our experience, it can be really hard for donors (and grantees) to hire people with the right mix of M&E skills and experience, especially when they have allocated limited headcount and resources. Often, rather than hiring for MEL roles outright inside foundations, other programmatic and operational staff suddenly find themselves saddled with ‘learning’ responsibilities without the knowledge and experience to deliver on these duties. In our experience, a traditional and field-tested MEL technical skillset is necessary but insufficient to enable culture and practice change; foundations need folks in these roles who have experience navigating systems change and complexity, given the systemic nature of the issues philanthropies and partners seek to address. Within our program, we deliberately chose to recruit folks with the right technical skills in MEL, research, and knowledge management, and demonstrated leadership experience on these issues at nonprofits. The combination of technical expertise and direct experience ‘on the other side’ in grantee organisations created a unique team with highly relevant offerings for colleagues, partners, and other funders on everything from evaluations to theories of change to learning agendas.
     
  2. An orientation towards long-term, systems-change and learning.
    The most popular and well-known evaluation methods in the field today are often least equipped to document and measure the complex mechanisms required for long-term systemic change; yet, for the most part, funder thinking and practice have not embraced MEL methods and expectations that better map to these intended impacts. Experimental evaluation techniques—like randomised controlled trials and their friends—have little role to play in the continuous learning required to unlock systemic change on issues like inequality, democracy, or corporate capture. We recognised that change was going to happen over the long run. Our theory of change talked about impact 20-30 years in the future, so we needed MEL approaches and strategies that could simultaneously view ultimate aims (10+ years away), recognise incremental progress, and lift out learnings to improve adaptation and continuous progress towards longer-term goals. Using this dual-horizon planning (i.e. for 10-30 years into the future and the 4-year strategy period) means fixing a defined point for the ultimate change sought and focusing on implementation and adaptation that paves the way towards that change over the long run. Importantly, it means acknowledging uncertainty, the need to shift strategies and tactics over time, and learning about what is really moving the needle in new and complexity-sensitive ways.
     
  3. Getting out of the way–relinquishing power and acting as facilitators, connectors, and fellow learners.
    Throughout this work, we were aware of our power to build a positive agenda and to play a constructive role in strengthening grantees’ MEL practices. Instead of taking our ideas and asking grantees to experiment with them, we decided to give grantees the agency to take the support offered as needed, keeping an open-door policy for our grantees to talk, share challenges, and work together to overcome them. We were also purposeful in connecting grantees and giving them space and, most importantly, resources to explore relevant ideas for their fields (e.g. via the Fiscal Governance Indicators project) and their MEL work (e.g. Sharing Insights on Strategy and Learning for Fiscal Justice). Positioning ourselves and our team as fellow learners with a deep respect for and commitment to our grantee partners’ autonomy and leadership enabled us to have conversations with organisations that donors are often excluded from. We became a resource for our partners that was grounded in a commitment to impact in our sector, collaboration, and humility.
     
  4. Prioritising collective learning over your individual donor needs and questions.
    Philanthropic donors are in a privileged position and have a wealth of possibilities yet to be explored when it comes to learning. Public funders (e.g. governments, multilaterals, or other development agencies) often have less flexibility to ‘change the rules of the game’ when it comes to MEL practice and requirements, due to the public oversight and structures of these institutions (though there are notable exceptions and efforts underway to raise the bar, like UNDP’s M&E Sandbox). Private foundations, however, often set their own rules and make their own decisions about what, if, and how to feature MEL as part of their grantmaking practices; in many cases, this discretion rests with individuals or teams within foundations too. Having experienced the burden of donor-mandated outcomes and indicators, we decided early to let grantees set their own outcomes, indicators, and learning priorities (we just asked that they had these in place). It did mean more work for us (see Approach One in the linked brief), but it ultimately equipped grantees to better self-direct and self-advocate with other donors for the MEL priorities and activities they thought most relevant. As discussions about trust-based philanthropy gain traction, we hope to see more philanthropic funders shift away from predefined outcomes and indicators; indeed, we know of funders who are already experimenting with dropping requirements for indicators altogether.
     
  5. Resources and leadership buy-in to advance a new vision and support transformative work in the field.
    None of this work would have been possible without the explicit commitment of EJP’s leadership to internal MEL and transformative MEL work for the field. And nowhere does the rubber meet the road more seriously than when it comes to budgets. Early on in the Fiscal Governance Program’s MEL journey, our team established two pools of funding for our MEL efforts. Getting buy-in from leaders and peers is often a multi-year process, and inside a bureaucratic or large institution the goalposts are often shifting, especially if you are building out early or first-ever efforts. We were able to secure early buy-in from our program’s senior leadership and key peers by (1) delivering concrete, hands-on support on portfolio and grant design and (2) soliciting input and then communicating clearly and regularly about the direction we were heading. Inside large institutions, we suggest focusing on your program or area as a demonstration case, as campaigning for institution-wide change is a long game. In our case, we were able to make many great changes and test new approaches without needing to get approval from, or impact, other teams internally. The one place where we really needed institutional approval and change was in getting a workable, results- and learning-focused database to store and manage our information; despite our best efforts, we failed to pull this change off. Lessons from this experimentation will inform future developments at OSF, though.

There was plenty that didn’t work too. Some of these flops and failures are featured in the learning brief, and in sharing out this work we expect more will be identified! But by most measures—and chiefly on the basis of overwhelmingly positive feedback from grantees—this work was successful. We hope exploring our lessons and resources offers donors and grantees new options and ideas to explore, and, if nothing else, that this case effectively demonstrates that donor MEL practice and policy can be done in respectful, learning-focused, and grantee-driven ways.
