BetterEvaluation lists events as a service to the evaluation community.
We do not officially endorse these events unless they are noted as a BetterEvaluation event.

Claremont Evaluation Center Professional Development Workshops

This August, Claremont Evaluation Center is offering its annual Professional Development Workshops, which provide working professionals and students with world-class practical and theoretical training in evaluation and applied research. This year’s workshop series is scheduled for August 14–18 and will feature 14 seminars covering various topics in evaluation and research methods. Taught by leading academics and seasoned practitioners, this workshop series can be experienced onsite at Claremont Graduate University, or wherever you are, thanks to highly interactive online webcasts.

Event dates: 14–18 August 2019
Event location: Claremont, California, United States
Event cost: Paid
Event type: Workshop

Daily Schedule

  • 8:00 am–9:00 am: Check-In and Continental Breakfast
  • 9:00 am–9:15 am: Morning Welcome and Introductions
  • 9:15 am: Workshops Begin
  • 10:45 am–11:00 am: Break
  • 12:00 pm–1:30 pm: Lunch Break
  • 3:00 pm–3:15 pm: Break
  • 4:45 pm: Workshops Conclude

For those attending via webcast, please note that all times listed are Pacific time. The Morning Welcome will be webcast; webcasts begin at 9:00 am, but webcast attendees are advised to log in by 8:45 am to test their connections.

Each workshop lasts one full day, from 9:00 am to 4:45 pm. On days when there are multiple workshops, you can physically attend only one, but you may sign up for concurrent online workshops and view them afterward.

To register in advance, or to attend any workshop online, visit the registration page. On-site attendance is also open to walk-in registrants.


Workshop Descriptions

Wednesday, August 14

Foundations of Evaluation & Applied Research Methods
Stewart I. Donaldson and Christina A. Christie 

This workshop will provide participants with an overview of the core concepts in evaluation and applied research methods. Key topics will include the various uses, purposes, and benefits of conducting evaluations and applied research, basics of validity and design sensitivity, evaluation theory, theories of change, strengths and weaknesses of a variety of common applied research methods, and the basics of program, policy, and personnel evaluation. In addition, participants will be introduced to a range of popular evaluation approaches including the transdisciplinary approach, program theory-driven evaluation science, experimental and quasi-experimental evaluations, empowerment evaluation, inclusive evaluation, utilization-focused evaluation, developmental evaluation, and realist evaluation. This workshop is intended to provide participants with a solid introduction, overview, or refresher on the latest developments in evaluation and applied research, and to prepare participants for intermediate and advanced level workshops in the series.


Questions regarding this workshop may be addressed to Stewart.Donaldson@cgu.edu.

Mixed Methods in Evaluation
Tarek Azzam

This workshop aims to strengthen participants’ understanding of qualitative and quantitative research methods. This course will explore mixed methods research and describe the history and foundations of this form of research. We will then examine the types of mixed methods designs available and discuss the process of research and evaluation as it relates to various mixed method designs. Ultimately, participants will have an understanding of the various types of mixed methods designs and how they have been applied to research and evaluation.

Learning Outcomes:
1. Examine and describe mixed methods research designs.
2. Examine the historical, philosophical, and theoretical foundations for conducting mixed methods research.
3. Examine the steps in mixed methods data collection used in the different types of designs.
4. Introduce the different procedures available for analyzing, mixing, and validating quantitative and qualitative data within mixed methods designs.
5. Evaluate the quality of mixed methods studies.

Questions regarding this workshop may be addressed to tarek.azzam@cgu.edu.

Thursday, August 15

Applications of Correlation and Multiple Regression: Mediation, Moderation, and More
Dale E. Berger

Multiple regression is a powerful and flexible tool with wide applications in evaluation and applied research. Regression analyses are used to describe relationships, test theories, make predictions from experimental or observational data, and model complex relationships. In this workshop we’ll explore preparing data for analysis, selecting models appropriate to your data and research questions, running analyses including mediation and moderation, interpreting results, and presenting findings to a nontechnical audience. The presenter will demonstrate applications from start to finish in SPSS and Excel. Because it is difficult to remember everything in a presentation, participants will receive detailed handouts with explanations and examples that can be used later to guide similar applications.

Level: Intermediate; participants should have some familiarity with correlation and statistical analyses.
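
For readers who want a concrete feel for these models before the workshop, here is a minimal sketch of moderation and simple mediation analyses in Python with statsmodels. It is our illustration, not workshop material (the workshop demonstrations use SPSS and Excel), and the variable names and simulated data are invented.

    # Hypothetical sketch: moderation and simple (Baron & Kenny-style) mediation.
    # Variable names and data are invented; the workshop itself uses SPSS/Excel.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({"x": rng.normal(size=n), "w": rng.normal(size=n)})
    df["m"] = 0.5 * df["x"] + rng.normal(size=n)   # mediator influenced by x
    df["y"] = (0.3 * df["x"] + 0.4 * df["m"]
               + 0.2 * df["x"] * df["w"] + rng.normal(size=n))

    # Moderation: 'x * w' expands to x + w + x:w; the x:w coefficient
    # estimates how w changes the strength of the x -> y relationship.
    moderation = smf.ols("y ~ x * w", data=df).fit()
    print(moderation.params[["x", "w", "x:w"]])

    # Mediation: compare the total effect of x on y with the direct
    # effect once the mediator m enters the model; a*b estimates the
    # indirect (mediated) effect.
    total = smf.ols("y ~ x", data=df).fit()        # path c (total effect)
    a_path = smf.ols("m ~ x", data=df).fit()       # path a (x -> m)
    direct = smf.ols("y ~ x + m", data=df).fit()   # paths c' (x) and b (m)
    indirect = a_path.params["x"] * direct.params["m"]
    print(f"total={total.params['x']:.2f}  direct={direct.params['x']:.2f}  "
          f"indirect={indirect:.2f}")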

Questions about this workshop may be addressed to Dale.Berger@cgu.edu.

Introduction to Qualitative Research Methods
Kendall Cotton Bronk

This workshop is designed to introduce you to qualitative research methods. The session will focus on how qualitative research can be effectively utilized in applied research and evaluation contexts. We’ll talk about how to devise qualitative research questions, how to select purposive samples, and what kinds of data to collect for qualitative investigations. We’ll also discuss a few approaches to analyzing qualitative findings, we’ll explore strategies for enhancing the validity of qualitative studies, and we’ll discuss the types of claims qualitative researchers can make based on their methods. Finally, we’ll dedicate time in the afternoon to addressing specific issues class participants are having with qualitative work they’re currently doing or plan to do.

Questions regarding this workshop may be addressed to Kendall.Bronk@cgu.edu.

Culturally Responsive Evaluation
Katrina L. Bledsoe

The beauty of the field of evaluation is in its potential responsiveness to the myriad contexts in which people—and programs, policies, and the like—exist. As the meaning and construction of the word “community” expands, the manner in which evaluation is conducted must parallel that expansion. Evaluations must be less about a community and more situated and focused within the community, thereby increasing their responsiveness to the uniqueness of the setting/system. Doing this, however, requires an expanded denotative and connotative meaning of community. Moreover, it requires us to think innovatively about how we construct and conduct evaluations, and to consider broadly the kinds of data that will be credible to stakeholders and consumers. The goal of this workshop is to engage the attendee in thinking innovatively about what evaluation looks like within a community, rather than simply about a community. We will engage in a process called “design thinking” (inspired by the design and innovation consultancy IDEO and Stanford’s Design School) to help us consider how we might creatively design responsive and credible community-based evaluations. This interactive course includes some necessary foundation-laying, plenty of discussion, and of course, opportunities to think broadly about how to construct evaluations with the community as the focal point.

Questions regarding this workshop may be addressed to Katrina.Bledsoe@gmail.com.

Friday, August 16

Strengths-based Evaluation
Stewart I. Donaldson

This workshop is designed to provide participants with an opportunity to increase their understanding of strengths-based evaluation, and to learn how to use this new approach to manage excessive evaluation anxiety (XEA), facilitate interpersonal harmony and success (a new AEA evaluator competency), and to improve the rigor and usefulness of evaluations conducted in dynamic and complex “real world” settings. We will examine the history and foundations of evaluating human and intervention strengths, theories of change based on stakeholder and positive psychological science theories and research, how these theories of change can be used to frame strengths-based evaluation questions and tailor evaluations, and how to gather credible and actionable evidence to improve evaluation accuracy and usefulness of strengths-based evaluations. Lecture, exercises, discussions, and a wide range of practical examples from evaluation practice will be provided to illustrate main points and key take-home messages, and to help participants to integrate these concepts into their own work immediately.

Questions regarding this workshop may be addressed to Stewart.Donaldson@cgu.edu.

Using Research to Drive Program Design and the Evaluation of Educational Interventions
Tiffany Berry and Rebecca M. Eddy

Level: Advanced beginner; participants should have prior familiarity with social science research methods and evaluation terminology, and some experience in educational settings.

Educational interventions are often designed with well-intentioned goals; however, they are not always grounded in the established research literature from the start, and research is sometimes consulted only as an afterthought. Designing program interventions based on the research literature can be (a) efficient and (b) cost-effective, and can (c) increase the likelihood of positive program outcomes. Skilled evaluators are in a unique position to guide stakeholders in integrating research-based interventions into program design, and to ensure that the evaluations designed for these interventions are appropriate. In this workshop, participants will consider how high-quality social science research literature can serve as the basis for educational program design and subsequent evaluation.

As practicing educational evaluators and academics for almost 20 years, we will share our experiences in the trenches as well as explore key issues that are important for contemporary educational evaluators to know. Using lecture, interactive activities, and shared discussion, participants will learn:

  • The importance of relying on high-quality research to both design and evaluate educational interventions.
  • The current educational policy and accountability landscape.
  • Which educational strategies have been shown to improve student learning outcomes.
  • How to integrate educational research into logic models to structure strong educational initiatives and interventions.
  • How to measure program implementation to determine if educational strategies produce measurable changes in student outcomes.
  • How to measure educational outcomes beyond traditional academic indicators, including social-emotional learning and college/career readiness.

These concepts will be explored using fun, interactive activities. The workshop is designed to engage the audience, so be ready to participate and add your voice to the mix! We will also supply a reading list to any participant who wants more evaluation resources about these concepts.

Questions regarding this workshop may be addressed to tiffany.berry@cgu.edu.

Expanding Pathways to Leadership
Michelle Bligh

There is no question that leadership profoundly affects our lives through our roles as researchers and evaluators. Organizational and programmatic successes and failures are often attributed to leadership. However, leadership is more than just a collection of tools and tips, or even skills and competencies; the essence of leadership is grounded in values, philosophies, and beliefs. In addition, pathways to leadership are complicated by the various challenges and opportunities rooted in gender, race, ethnicity, age, class, citizenship, ability, and experience.

Through the metaphor of the labyrinth, we will explore the following questions: What is effective leadership, and how can we encourage more researchers and evaluators to identify as leaders and proactive followers? How can we develop more inclusive leadership programs that allow diverse leaders to rise to the new challenges and demands of a global world? We will examine what successful 21st-century leadership looks like, drawing on theories of philosophy and ethics, charismatic and transformational leadership, and followership. Using research, cases, and exercises, we will examine constructs critical to practicing leadership, including empowerment, authenticity, accountability, courage, influence, and humility.

Questions regarding this workshop may be addressed to Michelle.Bligh@cgu.edu.

Saturday, August 17

Survey Research Methods
Jason T. Siegel

The focus of this hands-on workshop is to teach attendees how to create reliable and valid surveys for use in applied research. A bad survey is very easy to create; an effective survey requires a thorough understanding of the impact that item wording, question ordering, and survey design can have on a research effort. Only with adequate training can a good survey be distinguished from a bad one. The daylong workshop will focus on these three aspects of survey creation. The day will begin with a discussion of Dillman’s (2007) principles of question writing. After a brief lecture, attendees will use their newly gained knowledge to critique the item writing of selected national surveys. Next, attendees will work in groups to create survey items of their own. Using Sudman, Bradburn, and Schwarz’s (1996) cognitive approach, attendees will then learn the various ways question order can bias results. As practice, attendees will work in groups to critique the item ordering of selected national surveys, and will then propose an ordering scheme for the questions created during the previous exercise. Lastly, drawing on several sources, the keys to optimal survey design will be presented, and the design of national surveys will be critiqued. Attendees will then take the survey items they created and ordered in class and propose a survey design.
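
As a small aside on the “reliable and valid” point, the sketch below (our illustration, not workshop material) computes Cronbach’s alpha, a common internal-consistency estimate for a set of survey items; the function and the simulated responses are hypothetical.

    # Hypothetical sketch: Cronbach's alpha for a respondents-by-items matrix.
    # The simulated data stand in for real survey responses.
    import numpy as np

    def cronbach_alpha(items):
        # items: respondents x items array of scores
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of scale totals
        return (k / (k - 1)) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    trait = rng.normal(size=(300, 1))                # shared underlying construct
    items = trait + 0.8 * rng.normal(size=(300, 5))  # five noisy Likert-like items
    print(f"alpha = {cronbach_alpha(items):.2f}")    # closer to 1 = more consistent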

Questions regarding this workshop may be addressed to Jason.Siegel@cgu.edu.

Introduction to Grant Writing
Allen M. Omoto

This workshop covers some of the essential skills and strategies needed to prepare successful grant applications for education, research, and/or program funding. It will provide participants with tools to help them conceptualize and plan research or program grants, offer ideas about where to seek funding, and provide suggestions for writing and submitting applications. Some of the topics covered in the workshop include the pros and cons of grant-supported work, strategies for identifying sources of funding, the components and preparation of grant proposals, and the peer review process. Additional topics related to assembling a research or program team, constructing a project budget, grants management, and tips for effective writing will also be covered. The workshop is intended primarily as an introduction to grant writing, and will be most useful for new or relatively inexperienced grant writers. Workshop participants are invited to bring their own “works in progress” for comment and sharing. There will be limited opportunities for hands-on work and practice during the workshop. At its conclusion, workshop participants should be well positioned to read and evaluate grant applications, as well as to assist with the preparation of applications and to prepare and submit their own applications in support of education, research, or program planning and development activities.

Questions regarding this workshop may be addressed to Allen.Omoto@cgu.edu.

Improving Performance Monitoring for Social Betterment
Leslie Fierro

Frequently in the field of evaluation, individuals refer to “M&E”—shorthand for “monitoring and evaluation.” Academic and professional development in evaluation often focuses on the “E” part of this equation. However, in addition to performing evaluation, many programs that aim to improve wellbeing and strive for social betterment are required to report performance metrics to funders, and/or see value and utility in gathering data for their own learning purposes: data that can be obtained quickly, analyzed easily, and summarized at a glance. This is the realm of performance monitoring (also known as performance measurement). In this workshop, we will briefly cover the history of performance monitoring (as compared to evaluation) and discuss some of the debates (and empirical literature) regarding the strengths and limitations of this technique. The bulk of the workshop will focus on considerations and techniques for designing good performance measures and increasing the reliability and validity of data collection and reporting strategies. Finally, we will compare and contrast performance monitoring with evaluation and consider where important synergies exist between these two performance improvement approaches.
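
To make the “summarized at a glance” idea concrete, here is a minimal, hypothetical sketch of a performance-monitoring summary: a few invented indicators, each compared against a target and reduced to a simple status flag. It illustrates the general technique only, not material from the workshop.

    # Hypothetical sketch: indicators with targets reduced to an
    # at-a-glance status report. Names and numbers are invented.
    from dataclasses import dataclass

    @dataclass
    class Indicator:
        name: str
        actual: float
        target: float

        def status(self):
            # Simple traffic-light rule based on progress toward target.
            ratio = self.actual / self.target
            if ratio >= 1.0:
                return "on track"
            return "watch" if ratio >= 0.8 else "off track"

    indicators = [
        Indicator("clients enrolled", 420, 500),
        Indicator("sessions delivered", 1250, 1200),
        Indicator("follow-ups completed", 310, 450),
    ]
    for ind in indicators:
        print(f"{ind.name:<22} {ind.actual:>6.0f}/{ind.target:<6.0f} {ind.status()}")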

Questions regarding this workshop may be addressed to leslie.fierro@cgu.edu.

Sunday, August 18

The Science of Well-being: Theory, Research, Methods, & Applications
Stewart I. Donaldson and Saeideh (Saida) Heshmati

Since its formal introduction at the American Psychological Association Convention in 1998, the positive psychology movement has blossomed, giving birth to a vibrant community of scholars and practitioners interested in understanding and improving various aspects of individual, social, organizational, community, and societal well-being.

In this full-day workshop, you will be provided with an overview of the positive psychology movement, and its relationship with new directions in the science of well-being. Through lectures, small group discussions, exercises, and cases, you will learn about the latest theories, research, methods, and theory-driven applications of well-being science.  This emerging body of scientific knowledge can help you learn how to enhance your own well-being, the well-being of your loved ones, and how to improve the lives of underserved and disadvantaged populations often experiencing a range of social injustices that prevent them from flourishing.

Recommended reading:

  • Donaldson, S. I., Dollwet, M., & Rao, M. (2015). Happiness, excellence, and optimal human functioning revisited: Examining the peer-reviewed literature linked to positive psychology. Journal of Positive Psychology, 9(6), 1–11.
  • Rao, M., & Donaldson, S. I. (2015). Expanding opportunities for diverse populations in positive psychology: An examination of gender, race, and ethnicity. Canadian Psychology/Psychologie, 56(3), 271–282. (Special issue on positive psychology)
  • Kim, H., Doiron, K., Warren, M. A., & Donaldson, S. I. (2018). The international landscape of positive psychology research: A systematic review. International Journal of Well-Being, 8(1), 50–70.
  • Warren, M. A., & Donaldson, S. I. (2017). Scientific advances in positive psychology. Westport, CT: Praeger.

Please contact Stewart.Donaldson@cgu.edu or Saida.Heshmati@cgu.edu if you have questions.

Quasi-Experimental Design
William D. Crano

Conducting, interpreting, and evaluating research are important aspects of the evaluator’s job description. To that end, many good educational programs provide opportunities for training and experience in conducting and evaluating true experiments (or randomized controlled trials—RCTs—as they sometimes are called). In applied contexts, the opportunity to conduct RCTs often is limited, despite the strong demands on the researcher/evaluator to render “causal” explanations of results, as they lead to more precise understanding and control of outcomes. In these restricted contexts, which are considerably more common than those supporting RCTs, quasi-experimental designs may prove useful. Though they usually do not support causal explanations (with some noteworthy exceptions), they sometimes provide evidence that helps reduce the range of plausible alternative explanations of results, and thus, can prove to be of real value. This workshop is designed to impart an understanding of quasi-experimental designs. After some introductory foundational discussion focused on “true” experiments, we will consider quasi-experimental designs that may be useful across a range of settings that do not readily admit to experimentation. These designs will include time series and interrupted time series methods, nonrandomized designs with and without control groups, case control (or ex post facto) designs, regression-discontinuity analysis, and other esoterica. Participants are encouraged to bring to the workshop design issues they are facing in real world contexts.
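
As one concrete illustration of a design named above, the sketch below fits a standard segmented regression for an interrupted time series: a level-shift term and a slope-change term at the intervention point. The data are simulated and the sketch is our illustration, not the presenter’s materials.

    # Hypothetical sketch: segmented regression for an interrupted time series.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    t = np.arange(48)                        # e.g., 48 monthly observations
    cut = 24                                 # intervention begins at month 24
    df = pd.DataFrame({
        "t": t,
        "post": (t >= cut).astype(int),      # immediate level change
        "t_since": np.maximum(t - cut, 0),   # change in trend after the cut
    })
    df["y"] = (10 + 0.2 * t + 3 * df["post"]
               + 0.5 * df["t_since"] + rng.normal(size=t.size))

    # 'post' estimates the level shift at the intervention;
    # 't_since' estimates the change in slope afterward.
    model = smf.ols("y ~ t + post + t_since", data=df).fit()
    print(model.params)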

Questions regarding this workshop may be addressed to William.Crano@cgu.edu.

Practical Approaches for Evaluating Community and International Development Programs
Huey T. Chen

This workshop will illustrate evaluation approaches for community programs and international development programs, which are often dynamic and complex. Many traditional evaluation approaches assume a linear progression from planning through implementation to the outcome phase. These approaches, however, do not reflect the reality of community programs or international development endeavors. This workshop will introduce evaluation approaches that are proactive and dynamic, with a clear purpose of providing timely and/or insightful information to stakeholders in order to improve program resilience and sustainability in the community.

The workshop will provide training in two types of practical approaches, distinguished by evaluation purpose: (1) providing rapid feedback for program improvement; and (2) serving both the accountability requirements of funding agencies and stakeholders’ program improvement needs. Instructional methods will include lectures, exercises, discussions, practical examples, and key take-home messages. Participants will be provided with knowledge that will help them integrate these concepts and methods into their own work.

Recommended background readings include:

Chen, H. T., Morosanu, L., Bellury, N., & Kimble, L. (2019). Multi-wave formative evaluation of a retention program for minority nursing students: Intended effects, unintended consequences, and remedial actions. American Journal of Evaluation. https://journals.sagepub.com/eprint/n2FAC9FJ5GHIpNJszcWC/full

Chen, H. T. (2015). Practical program evaluation: Theory-driven evaluation and the integrated evaluation perspective (2nd ed.). Thousand Oaks, CA: Sage.