Search results

  1. Introduction to Qualitative Research Methods (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    19th August, 2011
    United States

    Facilitator: Maritza Salazar. This workshop aims to provide participants with the basic skills needed to choose appropriate qualitative data collection methods according to their needs. To that end, it introduces different types of qualitative research methods and their underlying theoretical paradigms, with particular emphasis on how they can be used in applied research, evaluation, and consulting settings.

  2. Evaluation Theories: Introduction and Application (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    19th August, 2011
    United States

    Facilitator: Melvin Mark. This session provides an overview of a range of evaluation theories, models, and approaches and their application. Its basic purpose is to enable participants to understand: the potential benefits (and costs) of using evaluation theory; key differences across alternative evaluation theories; the nature and practice implications of a small set of notable evaluation theories; and how evaluation theory can be used to improve evaluation practice.

  3. Developmental Evaluation: Applying Systems Thinking and Complexity Concepts to Enhance Innovation and Use (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    21st August, 2011
    United States

    Facilitator: Michael Quinn Patton. This workshop provides an introduction to the unique niche and purpose of developmental evaluation, which supports innovation by providing real-time feedback about the emergent and dynamic realities resulting from an intervention.

  4. Integrated Data Analysis in Mixed Methods Inquiry (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    21st August, 2011
    United States

    Facilitator: Jennifer Greene. This one-day workshop introduces participants to mixed methods evaluation. Its focus is the central challenge of mixed methods evaluation: integrating and connecting one data set with another. The unique purposes and characteristics of mixed methods evaluation, and the alternative designs that accompany each, will be discussed, followed by an elaboration of mixed methods data analyses.

  5. Practical Meta-Analysis (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    21st August, 2011
    United States

    Facilitator: Becky Reichard. Evidence-based practice requires the most up-to-date and accurate understanding of existing research. Meta-analysis is a tool for synthesizing and understanding existing empirical research: it enables researchers to estimate the effect size of a phenomenon through a series of statistical adjustments and corrections, and thereby provides a basis of evidence for practice. With the aim of presenting meta-analysis in a practical and clear manner, the workshop focuses on the basic steps of meta-analysis.

  6. Evaluation in Developing Countries (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    22nd August, 2011
    United States

    Facilitator: Zenda Ofir. Evaluators working in developing countries carry a major responsibility to ensure that they not only ‘do no harm’, but that their evaluation practice supports ‘real’ development at a micro, meso, and/or macro scale. Those working in such situations need to be clear about their values concerning development work, and these values should be reflected in their evaluation practices as well.

  7. Psychology of Evaluation (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    22nd August, 2011
    United States

    Facilitators: Michael Scriven and Stewart I. Donaldson. Some evaluation ‘models’ already embody (claimed or real) psychological insights (empowerment and appreciative inquiry come to mind), but others might benefit from incorporating them. This workshop lays out some of the psychological dimensions of evaluation and encourages a discussion of their application to evaluation practice.

  8. Advancing Evaluation through Enhancing Viable, Effectual, and Transferable Validity: The Theory-Driven Evaluation Perspective (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    22nd August, 2011
    United States

    Facilitator: Huey-Tsyh Chen. To address stakeholders’ concerns with internal and external validity (viable, effectual, and transferable validity), Chen discusses an integrated evaluation approach developed from the theory-driven evaluation perspective.

  9. Cleaning Up Your Act for Showtime!  How to Prepare Data for Analysis and Display (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    22nd August, 2011
    United States

    Facilitator: Dale Berger. Statistical results based on problematic data can be a costly error and an embarrassment, so it is important to deal with problematic data before starting analysis. This workshop demonstrates the consequences of ignoring problems with data, followed by diagnostics and remedies, along with principles that can guide choices in dealing with such data sets.

  10. Basics of Evaluation & Applied Research Methods (Summer 2011 Professional Development Workshops Series of Claremont Graduate University)

    Event
    19th August, 2011
    United States

    Facilitators: Stewart I. Donaldson & Christina A. Christie. This workshop prepares participants for the intermediate- and advanced-level workshops in the series by providing an overview of the latest developments in evaluation and applied research. Key topics include the various uses, purposes, and benefits of conducting evaluations and applied research; the basics of validity and design sensitivity; the strengths and weaknesses of a variety of common applied research methods; and the basics of program, policy, and personnel evaluation.
