Search results

  1. AES Workshop: Performance Story Reports

    Event
    Workshop
    23rd March, 2017
    Australia
    Paid

    Performance story reports aim to strike a balance between depth of information and brevity. They are written in accessible language and help build a credible case about the contribution a program has made towards outcomes or targets. They help teams and organisations focus on results and provide a common language for discussing different programs. This workshop will explore different approaches to performance stories and how performance story reports are developed. It will outline the steps in building a report and explore the role of program logic and evidence in developing it. It will be an interactive and engaging workshop involving case studies and group processes.

  2. What does it mean to ‘un-box’ evaluation?

    Blog
    30th January, 2019

    This guest blog by Jade Maloney is the first in a series about un-boxing evaluation – the theme of aes19 in Sydney, Australia. The series is designed to generate a global discussion of the theme ‘un-boxing evaluation’ and what that means for our profession and practice. Jade Maloney is co-convenor of aes19. She is also a Partner at ARTD Consultants, specialising in design and evaluation with people with disability and in the disability sector.  

  3. I'm doing an impact evaluation: What evidence do I need? (#AES17 presentation slides)

    Resource
    Overview
    2017

    Are quantitative or qualitative methods better for undertaking impact evaluations? What about true experiments? Is contribution analysis the new 'state of the art' in impact evaluation or should I just do a survey and use statistical methods to create comparison groups?

    Determining one's plan for an impact evaluation occurs within the constraints of a specific context. Since method choices must always be context-specific, debates in the professional literature about impact methods can at best provide only partial guidance to evaluation practitioners. The way to break out of this methods impasse is to focus on the evidentiary requirements for assessing causal impacts.

  4. Shift your practice to advance evaluation in our changing world

    Event
    Workshop
    23rd July, 2020 to 27th July, 2020
    Online
    Paid

    Facilitated by Samantha Abbato, Liz Smith and Sandar Duckworth. 2.00pm to 3.30pm AEST.

  5. AES Workshop: Developing Monitoring and Evaluation Frameworks

    Event
    Workshop
    20th March, 2017 to 21st March, 2017
    Australia
    Paid

    This workshop follows the structure of the textbook ‘Developing Monitoring and Evaluation Frameworks’, authored by Dr Ian Patrick and Anne Markiewicz. It presents a clear and staged conceptual model for the systematic development of an M&E Framework. It will examine the steps and techniques involved in designing and implementing the framework; explore potential design issues and implementation barriers; and cover the development of a Program Logic, the identification of key evaluation questions, the development of performance indicators, and the identification of processes for data collection and for ongoing analysis and reflection based on the data generated. The facilitator will encourage interactive peer-to-peer dialogue to share experiences and learning, and will draw on case studies to encourage the application of knowledge and skills to evaluation contexts.

  6. Designing and Implementing a Monitoring and Evaluation System

    Event
    Workshop
    5th March, 2018 to 8th March, 2018
    Australia
    Paid

    This workshop draws on the textbook ‘Developing Monitoring and Evaluation Frameworks’ (SAGE, 2016) authored by Anne Markiewicz and Ian Patrick. It presents a clear and staged conceptual model for the systematic development and implementation of an M&E System. 

  7. AES Workshop: What does the evaluator mean? Building the skills to understand evaluation reports and apply the findings to policy and practice

    Event
    Workshop
    21st May, 2015
    Australia
    Paid

    Evidence-based practice depends on skilled practitioners and policy makers being able to extract, interpret and apply key information from evaluation reports. Understanding evaluation reports, their limitations and their language, and then making good use of the findings, are essential skills for benefiting from the investment in the evaluation process. This workshop provides the link between the evaluator and the evaluation user: users will learn what to look for and what questions to ask, and evaluators will learn how to make their reports more accessible.

  8. Designing and Implementing a Monitoring and Evaluation System workshop (30 Jul, 31 Jul, 1 Aug, 2 Aug 2018 Sydney)

    Event
    Workshop
    30th July, 2018 to 2nd August, 2018
    Australia
    Paid

    This workshop draws on the textbook ‘Developing Monitoring and Evaluation Frameworks’ (SAGE, 2016), authored by Anne Markiewicz and Ian Patrick. It presents a clear and staged conceptual model for the systematic development and implementation of an M&E System. The workshop comprises four separate but inter-related components, one presented each day. Participants can choose to take the full program or part of it, depending on their experience and needs.