Holistic Transformative Evaluator Framework: Addressing Technical and Gender-Responsive Training Needs in Asia-Pacific

Supported by the FIME Small Awards Program, this research paper by Mahesh Krishnan Ramesh examines gaps in gender-responsive evaluator training across the Asia-Pacific region, identifying a "methodological disconnect" between dominant linear, positivist approaches and the lived realities of marginalized and gender-diverse communities.


What are the key features of the resource?

The main objective of this research is to identify and address gaps in technical, personal, interpersonal, and cross-cutting gender-related competencies in evaluator training within the Asia-Pacific. The project focuses on challenging the "methodological disconnect", a structural gap where standardized, text-based evaluative tools fail to capture the complex power dynamics and indigenous fluidities of marginalized, gender-diverse, and neurodivergent populations.

The paper provides a detailed landscape analysis benchmarking 18 regional and global evaluation training programs against feminist and transformative principles. It features deep qualitative insights from nine global evaluation experts. Additionally, it showcases practical arts-based methodologies prototyped at the 2025 OMLC Learning Lab in Sri Lanka, such as participatory "Wall Comics" and "Pareidolic Tracing". 

The resource culminates in the "Holistic Transformative Evaluator Framework," a competency model structured around five lenses: Political (Feminist Foundations), Contextual (Decolonization), Inner (Cognitive Pluralism), Methodological (Complexity), and Social (Relational Ethics).

What contribution does the resource make to Feminist Evaluation? 

This resource moves beyond standard technical mechanics to reclaim "Cognitive Pluralism". It legitimizes intuition, emotion, and semiotic literacy as rigorous technical competencies necessary for understanding complex intersectional realities. The framework shifts the pedagogical paradigm from static "cultural competence" to an ongoing practice of "decolonial humility". Crucially, it operationalizes power-sharing by redefining participation: it calls for individuals with direct lived experience to be hired onto evaluation teams rather than treated merely as data subjects or respondents.

How can other people use this resource?

Training institutions, evaluation networks, and academic bodies can use the 5-Lens Framework as a robust benchmarking tool to audit and redesign their capacity-building curricula. Commissioning agencies and donors can use the recommendations to structure inclusive Terms of Reference (ToRs) that prioritize hiring evaluators with lived experience and competencies covered in the Holistic Transformative Evaluator Framework.

Furthermore, M&E trainers can adopt the paper's arts-based pedagogical recommendations, such as simulation games and semiotic visual tools, to create a "safe sandbox of failure" for evaluators. These tools act as a "magic circle" in which evaluators can play, test complex systemic scenarios, and experience failure without causing real-world harm. These experiential spaces also allow evaluators to engage in critical dialogue, practice reflexivity, and identify their own deeply held biases and positionality.
