
Reflect and Revise: Evaluative Thinking for Program Success


Presentation Transcript


  1. Reflect and Revise: Evaluative Thinking for Program Success
Tom DeCaigny and Leah Goldstein Moses

  2. Topics to guide our discussion
• Benefits of evaluation and assessment to staff, program participants, and the organization as a whole
• Building on existing knowledge and resources for evaluation
• Launching evaluation and assessment efforts
• Choosing the right methods of assessment and evaluation
• Learning from other organizations to integrate evaluative thinking and use evaluation to support programs

  3. Let’s get to know each other
• Your presenters: Leah Goldstein Moses and Tom DeCaigny
• How comfortable are you with evaluation?
• What comes to mind when someone asks you to evaluate and/or assess your work?
• What are your challenges with evaluation?

  4. First, some definitions
• Assessment: The act of determining the standing of an object on some variable of interest; for example, testing students and reporting scores.
• Evaluation: Systematic investigation of the worth or merit of an object, e.g., a program, project, or instructional material.
• Source: Joint Committee on Standards for Educational Evaluation. (1994). The Program Evaluation Standards, 2nd ed. Thousand Oaks, CA: Sage. (Used with permission of the publisher.)

  5. Benefits of evaluation and assessment
• Be accountable to important stakeholders
• Professional and organizational development – learn how you are doing
• Program management – see where your programs need continued support or improvement
• Investigation and learning
• Feeds curiosity and fosters innovation

  6. Stories of evaluation benefits
• Utilizing evaluation results to identify a need and develop a new project (special needs – ARISE case study)
• Utilizing evaluation results to improve program quality and design (teaching artist training case study)
• Evaluation and assessment as part of reflective artistic practice (A Cycle of Artistic Inquiry case study)

  7. A Cycle of Artistic Inquiry – Performing Arts Workshop and Dr. Richard Siegesmund (2000)

  8. Ensuring evaluation benefits are shared
• Evaluation, at its best, is “engaged in”, not “done to”.
• When developing or improving evaluation systems, think about who will be doing the work for the evaluation (distributing surveys, gathering information, analyzing the data):
  • Is there a way to decrease the burden?
  • Is there a way to provide benefits?
• Examples of evaluation efficiency and incentives

  9. Build on existing knowledge and resources
• Internal insights can really support a new evaluation effort. Determine:
  • What do we collect already?
  • What does the information we already have tell us?
  • What can we report on just from our own internal record keeping or observations?
  • What can we gather in the course of our work – during existing programs, contacts, etc.?
• Example from ARISE: student achievement – test scores.

  10. Build on existing knowledge and resources
• Use external information, such as reports done by organizations you admire:
  • What did they study?
  • How did they gather information?
  • Can you apply any of their tools or methods?
  • Can you infer or generalize anything from their findings so you don’t have to replicate their effort?

  11. Getting started in evaluation and assessment
• Logic models are incredibly useful. They help you:
  • Determine how your efforts are related to your expected impact
  • Map out what you want to measure, and why
• During the logic model process, you can determine what data you already have and what you are lacking

  12. Getting started in evaluation and assessment
• After you have created a logic model and/or identified data gaps, you can determine what you are going to collect, when, and in what format
• Surveys are useful, but in the arts you might want to use an artistic process or other valid alternative assessments:
  • Illustrative rubrics
  • Observation
  • Portfolios

  13. Learning from others’ experiences
Notes from our discussion:
• What evaluation approaches have worked well for you?
  • Electronic portfolios in the classroom: they reflect project-based learning, show progression over the course of the year, and can also be seen by parents and administrators. Time to reflect can be challenging but is important.
  • “Level Best” is a good resource.
  • Tried to find things that already existed and could be used in the organization; Festin “Theater Communications Group” is a good resource.
  • Anecdotal information and journals can have a bigger impact on boards and other audiences that don’t care much for quantitative data.
  • Site visits for board members are required as part of their responsibilities.
  • Having tools at your fingertips right at the time they are needed: Did anything good happen today? Did any challenges come up?
  • Using incentives: Crayola pencils worked well for parents.
• Where have you struggled?
  • Finding time for reflection.
  • Having the right tool for evaluation.
  • Logic models can be cumbersome or difficult to use.
  • Having a way to capture, understand, and communicate unexpected outcomes.
  • Avoiding bias introduced through body language and tone; need to make sure to encourage honesty.
  • Boards can ignore quantitative data.
  • Finding sophistication and depth in questions is hard when they are in a survey.

  14. Our contact information
Tom DeCaigny, Executive Director, Performing Arts Workshop
T: (415) 673-2634 x207 / F: (415) 776-3644
E: tom@PerformingArtsWorkshop.org
www.PerformingArtsWorkshop.org

Leah Goldstein Moses, President, The Improve Group
T: (877) 467-7847 x11 / F: (612) 656-1731
E: leah@theimprovegroup.com
www.theimprovegroup.com
