Evaluation rubrics: how to ensure transparent and clear assessment that respects diverse lines of evidence

This report provides a detailed description of an evaluation, written by Judy Oakden as part of the first BetterEvaluation writeshop process, led by Irene Guijt. Peer reviewers for this report were Carolyn Kabore and Irene Guijt.


Independent external evaluators generally have to work within a range of constraints: time, money, and data are often in shorter supply than is ideal. This article presents an example of how a team of external evaluators worked around these constraints on an evaluation in the education sector.

The evaluation process incorporated a logic model to identify boundaries. It also featured rubrics for making evaluative judgements; their use supported robust data collection and framed the analysis and reporting. A mixed-methods approach, combining qualitative and quantitative survey data with existing project data, helped build a rich evidential picture. Furthermore, an indigenous Māori perspective was present throughout the evaluation, ensuring Māori views were heard, respected, and actioned within this mainstream project.


  • Understanding the context
  • Engaging and framing
  • Description of the process
  • Understanding causes
  • Synthesizing and valuing
  • Reporting and supporting use
  • Changes as a result of the evaluation
  • Conclusions


Oakden, J. (2013) Evaluation rubrics: how to ensure transparent and clear assessment that respects diverse lines of evidence. BetterEvaluation, Melbourne, Victoria. https://www.betterevaluation.org/sites/default/files/Evaluation%20rubrics.pdf

