Reflections from BetterEvaluation’s outgoing CEO, Patricia Rogers

9th September 2021 by Patricia Rogers

As of September 9th, BetterEvaluation’s inaugural CEO, Patricia Rogers, has passed on the reins to the BetterEvaluation team and Mark Madden, the interim CEO. In this blog post, Patricia shares some of her highlights from the past 12 years.

Leading BetterEvaluation for so many years has been an incredible privilege. In stepping down from the helm, I am excited to return to what I love and spend more of my time grappling with some of the methodological challenges of our field.

As the BetterEvaluation knowledge platform moves into its next phase, I wanted to share some reflections on the journey so far and some of my personal highlights.

Evaluation matters and BetterEvaluation matters

BetterEvaluation began 12 years ago as a series of conversations about the need to improve the quality of evaluation and how to do so. Evaluation holds the promise of supporting better decisions, demonstrating the value of what is done, and contributing to better outcomes for people and the planet. But when done badly, evaluation can lead to bad decisions, decrease motivation and misdirect efforts.

And it was evident that many of the lessons learned about evaluation – such as the risks of making evaluative judgments based solely on a few indicators – were forgotten or not known by those coming into evaluation, resulting in mistakes and poor practices being perpetuated.

From advocacy for particular methods to advocacy for situationally appropriate evaluation

Some approaches to improving evaluation focus on promoting particular methods, processes, or approaches, such as the promotion of Randomised Controlled Trials (RCTs) as the ‘Gold Standard’ of evaluation. However, there is no one-size-fits-all approach to evaluation. Instead, methods, processes, and approaches need to be situationally appropriate: suited to the nature of what is being evaluated and the purpose of the evaluation, and fitting within time, money, and data limitations.

The corollary was that knowledge about how to do evaluation better needs to be co-created and shared by people with experience in diverse contexts, and that those involved in evaluations need to be honest about the strengths and limitations of evaluations and their designs.

BetterEvaluation aimed to support users to choose the most appropriate methods, processes, or approaches and to apply them well, or to effectively manage those who were doing so.

One of the main inhibitors was that there was no single place people could turn to for a comprehensive list of methods, processes and approaches. This is what BetterEvaluation set out to fix.

The very act of providing lists of methods and processes was seen as potentially valuable and a form of advocacy in itself. BetterEvaluation aimed to highlight tasks and methods that were often left out of evaluation guides. In particular, during several years of heated disputes about methods and designs for impact evaluation, it was clear that adding a list of non-experimental options to existing lists of experimental and quasi-experimental options could help inform better decisions, discussions and efforts to improve practice.

The Rainbow Framework of evaluation tasks, which provided the “filing system” for methods and processes, also made it clear that the upfront framing of an evaluation’s purposes is as important as the technical design stage.

In recent years we have focused on how evaluation can be more useful under conditions of ongoing uncertainty and rapid, unpredictable change, including the COVID-19 pandemic, and the interconnected and systemic challenges of equity and environmental sustainability.

People work together to build and improve BetterEvaluation

Throughout BetterEvaluation’s development, we have worked with evaluators and evaluation managers to curate and co-create information to publish on our knowledge platform.

While some of this information was available in conference presentations and other reports, these were often ephemeral, hard to find or of varying quality: project websites often disappeared when a project ended, and presentations were often not readily accessible afterwards. One of our aims was therefore to provide an accessible and durable repository for existing material.

When I reflected on the origins of BetterEvaluation, I was astonished by the range of places, events and people that contributed to turning ideas into something practical. These include African Evaluation Association conferences in Cairo, Yaounde and Accra; American Evaluation Association conferences in Minneapolis, Washington, DC, Baltimore and Denver; Australasian Evaluation Society conferences in Melbourne, Canberra and Brisbane; Aotearoa New Zealand Evaluation Association conferences in Auckland; European Evaluation Society conferences in Maastricht and Helsinki; the GIZ international conference on systemic approaches in evaluation in Eschborn; the Pacific MEL convening in Suva; the Community of Practice in Evaluation South Asia conference in Kathmandu; the Association of Professional Evaluation conference in Port Moresby; a meeting of the Outcome Evaluation Learning network in Beirut; a CGIAR conference in Brasilia and a meeting in Cali; IPDET courses in Ottawa; Asian Evaluation Week conferences in Xian and Chengdu; and key meetings in New York, Utrecht and Melbourne.

Some of my personal highlights include the live knowledge creation at events. For example, at the 2013 SAMEA conference, we walked through an early version of the Rainbow Framework and got feedback on the proposed tasks. At an AES conference, we used our booth to collect ideas about addressing particular tasks – and then added these options to the site. At an AEA conference, we invited people to an interactive session to share ideas about supporting evaluation users to actually use evaluation findings.

Another highlight was our innovative virtual writeshop process, in which individuals were supported to write up an account of a particularly interesting example of practice as an evaluator or an evaluation manager. These accounts covered developing and using rubrics, evaluating a high-profile project, contrasting the perspectives and experiences of an evaluator and an evaluation manager on the same project, including the perspectives of different project participants, and ensuring evaluations listen to the voices of children affected by projects.

And it’s been an absolute privilege to work with the long list of individual contributors to page content – volunteers from around the world who have shared their experiences and expertise to write or improve a method, approach or thematic page so that their knowledge about how to do evaluation better can be shared and used.

Being truly global

It’s thanks to our contributors and the knowledge that has been shared that BetterEvaluation has become such an important global resource.

In the early days, we excitedly watched the Google Analytics map of users as it gradually filled in every country. Today, BetterEvaluation.org is visited by over 1.2 million users a year. The growth in users from the Global South has been particularly exciting: it has increased by over 50% since July last year, bringing the number of Global South users nearly on par with users from the Global North.

Map: Snapshot of usage from the Global South, 1st July 2020 – 30th June 2021

People find it useful in many different ways

But usage goes well beyond the numbers. I’ve always been delighted to hear about the different ways that individuals and organisations use BetterEvaluation.

Individual evaluators have used BetterEvaluation to learn more about unfamiliar methods, processes, and approaches and discover even more options.

Individual evaluation managers have used BetterEvaluation to plan a whole evaluation (using the GeneraTOR and the Manager’s Guide to Evaluation) and to review consultants’ proposals. If a proposal refers to an unfamiliar method or approach, the manager can draw on BetterEvaluation to understand what it would involve – and can sometimes use this to discuss it with the consultant and ensure they share an understanding of the method and why it might or might not be appropriate. Organisations have used BetterEvaluation to develop guidance, evaluation policies and frameworks.

Evaluation educators and trainers have used BetterEvaluation as a general resource or as an overall framework for thinking about evaluation. Experienced evaluators have often used it as a reference to recommend to novice colleagues. An evaluation organisation has included BetterEvaluation as part of orientation for new hires, asking them to choose an unfamiliar method, process or approach and prepare a presentation for other staff about it.

Evaluation researchers have used the list of evaluation approaches to systematically identify distinguishing features of a new approach.

BetterEvaluation has also had broader impacts, providing a conceptual framework for thinking about the different types of questions that evaluation seeks to answer and embedding core ideas into evaluation planning, such as the importance of framing and a utilization focus – ideas that are now commonplace for evaluators but not always known to evaluation managers.

And on to the future

To everyone who has been part of this journey so far – thank you.

While we've come a long way, there is still a lot of work to do to get evaluation to where it needs to be to face the pressing challenges of our time. I look forward to continuing to work with our global community to make evaluation better.

Author: Patricia Rogers, CEO, BetterEvaluation, Melbourne

Comments

Nancy White

Oh Patricia, what a journey. Congratulations on what you have inspired, co-created and nurtured. Wishing you the best on your next journey!

Fred Carden

Patricia
I remember the early days when this was an idea in your mind. What a great addition BetterEvaluation has become to the evaluation community. Congratulations!
