Scaffolding new methods - examples
These examples have been contributed for discussion at the 'flipped conference' session of the American Evaluation Association to be held at 11.15am - 12 noon on Saturday November 11, 2017 in the room Thurgood Marshall East, at the Washington Marriott Wardman Park.
Example 1 - Learning Circle by Jane Davidson
One of the best things about being part of the New Zealand evaluation community was the times when we put our heads together to figure something out.
In this case, I had been using rubrics for a while, but only on the kinds of projects that tended to land in my own consulting practice. I had shared some of these examples with colleagues and friends, and some of them decided this methodology would be a great fit for some work they were doing.
One of my colleagues, Julian King, has a background in economics and was interested in applying rubrics methodology to the question of Value for Money, or Value for Investment as he now calls it.
We had a couple of phone calls about it, but because this particular application was taking our ideas to a whole new level, we decided to get together and work it through face to face.
There really is no substitute for getting a diverse set of brilliant minds in the room, and that's what happened that day. Each person was able to bring their own knowledge and expertise about project contexts, culture, client needs, what VfM questions evaluation needed to answer, what principles and values were relevant for defining "value" in various contexts, and of course the knowledge, skills, and nuanced know-how about economic cost analysis methods and how to make rubrics work in tricky situations.
It wasn't at all like a mini workshop where one person "teaches" the others; it was more like a "group grappling" where we puzzled through it together to figure out what would work, and what would be the best ways to explain it to clients and other stakeholders.
I think one of the things that was most valuable for all of us was being able to listen to each other thinking out loud and doing that thinking out loud together. It's quite a different way of learning from listening to a polished and rehearsed presentation that has been simplified to fit into a conference time slot and then trying to apply it to an overly simplified example of our own under the impossible time constraints of a group exercise.
Several of this same group came together at different times, face to face or on Skype, to work through the application of this and related methodologies to various cases. I found it greatly sharpened my own understanding of rubrics methodology - partly because we were adapting it to new contexts and projects and blending it with broader expertise, but partly because it was forcing me to explain things that I hadn't consciously thought about but was intuitively doing.
I asked Julian to comment on the experience from his perspective:
"From the get-go I found rubrics intuitively appealing, easy to understand in theory, but at times incredibly challenging to develop and use in practice. I can't overstate the value of a band of fellow travellers, to wrestle with the hard stuff, innovate and problem-solve together. It is through this process that we have developed a decade's worth of practice-based knowledge and experience in evaluative reasoning."
-- Julian King
Julian and his Kinnect Group colleagues have written about how rubrics have enhanced their evaluation practice in an open access article in JMDE, which covers many aspects of the authors' learning journey in gaining practical mastery of rubrics methodology.
Example 2 - Scaffolding learning in healthcare sector - Mads Teisen (Capital Region of Denmark / University of Copenhagen)
*Mads Teisen will be taking part in the AEA17 Flipped Session on Scaffolding Methods on Saturday to discuss this example further.
What was the evaluation method/process/approach?
Monitoring and evaluation in healthcare sector.
What did you use to learn how to actually apply it in practice?
Integration of evaluation, LEAN, and improvement theory, and inclusion of all levels of a large organization.
How did it work?
It required a LOT of training, including the regular use of data on whiteboards (LEAN: Kaizen meetings), developing knowledge of SMART indicators, and training on improvement theory (PDSA cycles). 2,000 whiteboard meetings were held weekly across the organization. It went reasonably well. At top management level: perfect. At employee level: heavily dependent on management follow-up.
What factors helped or hindered this?
Being "something new", along with lots of top management attention, made implementation more or less easy. But sustainability is a huge problem. The management attention has shifted, and it's up to lower-level management to keep the system alive. We get a helping hand from politicians, who rely on the information provided, and we are helped by the burning platform of demography. We MUST improve, and change our healthcare programme.
Example 3 - Theory of Change facilitation - Jess Dart (Clear Horizon)
What was the evaluation method/process/approach?
A combination of experiential training, first with a hypothetical example, then with a real example. Then a number of deliberate steps, from practicing with peer feedback, observing experts in action, reflection, then beginning to practice it in pairs, to getting expert review on the final products. It can take years to learn how to facilitate this well.
How did it work out?
We do this with our new staff; it works quite well for some people, but it does depend on their starting competency. Raw talent, prior skills such as facilitation skills, understanding implementation, and prior programming experience all make a difference. And of course, a willingness to constantly learn and seek feedback.
What factors helped or hindered this?
Dedication and persistence to building capacity helps. Multiple inputs and iterative learning help. Reflection helps. Being given safe spaces to practice helps.