
Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice


Presentation Transcript


  1. Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice. Monica Hargraves, PhD, Manager of Evaluation for Extension and Outreach, Cornell University, October 2009, mjh51@cornell.edu

  2. Brief outline • What we do (The “Evaluation Partnership” approach) • Key steps in the training (with stories from extension program partners) • “Swimming against the tide…” • Making it feasible and sustainable

  3. Evaluation Partnerships • CORE provides training and brings evaluation expertise • Partners bring experience and expertise in their programs, their communities, their “systems” • The Planning Phase is a one-year commitment, with intentions and clarity of roles captured in an MOU

  4. What the EP entails, in the “planning year” Stages: • Preparation for Partnership (Jan – March) • Modeling (intensive!) (April – June) • Evaluation Planning (July – Oct/Nov) Formats this year: • Two in-person, full-day training meetings • Web conferences • Listserv, e-mail, and phone support

  5. History of the Project within Cornell Cooperative Extension (CCE) 2006: NYC

  6. History of the Project within Cornell Cooperative Extension (CCE) 2007: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, Ulster

  7. History of the Project within Cornell Cooperative Extension (CCE) 2009: Chemung, Chenango, Clinton, Cortland, Franklin, Fulton & Montgomery, Genesee, Jefferson, Madison, Monroe, Oneida, Ontario, Oswego, Rensselaer, Saratoga, Seneca, Tioga, Tompkins, Ulster, Wayne

  8. Evaluation Phases

  9. “Systems Evaluation Protocol”: Planning Phase

  10. Stakeholder Analysis (CCE-Jefferson 4-H Dairy Program: Stakeholder Map). Stakeholders mapped include: Other Youth Programs, NYS 4-H, Cornell University, SUNY Morrisville, SUNY Cobleskill, Local School Districts, Taxpayers, CCE Staff, Youth, Funders, JCADCA, Dairy Program Volunteers, Parents, NYS Jr. Holstein Association, Local Ag Businesses, FFA Teachers, Breed Associations, CCE Board of Directors, National Dairy Industry, 4-H Members, Jefferson County Fair Board, Surrounding County Youth, Jefferson County Dairy Producers, State Fair, Jefferson County Legislature, Media

  11. Developing Stakeholder charts

  12. Stakeholder Analysis … why it matters:

  13. Logic Model Development

  14. Quick “poll” on formal modeling … Think of programs you are evaluating, or wish to evaluate. How many of those have a written-down model (Logic Model, or something similar)? A – all B – many C – some D – few E – none

  15. Focus on Activities, Outputs, and Outcomes

  16. Make connections (create “links”)

  17. Pathway Model Development: 4-H “SET-To-Go” (an after-school science program), CCE-Cortland County Pathway Model, October 2009

  18. “Mining the Model”
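  The Netway is the web tool the partnership uses for building pathway models; purely as an illustration (not the Netway’s actual data model, and with invented node names), a pathway model can be treated as a directed graph of activities, outputs, and outcomes, and one simple way to “mine” it is to trace every chain of links that runs from an activity out to a terminal outcome. A minimal Python sketch under those assumptions:

    # A pathway model as a directed graph: each node maps to the nodes it feeds.
    # Node names are hypothetical, for illustration only.
    pathway_model = {
        "hands-on project sessions": ["youth complete projects"],
        "youth complete projects": ["youth gain science skills",
                                    "youth feel part of the group"],
        "youth gain science skills": ["increased interest in science careers"],
        "youth feel part of the group": ["sustained program participation"],
    }

    def mine_pathways(model, start, path=None):
        """Enumerate every chain of links from `start` to a terminal outcome."""
        path = (path or []) + [start]
        successors = model.get(start, [])
        if not successors:                     # terminal node: a long-term outcome
            return [path]
        chains = []
        for nxt in successors:
            chains.extend(mine_pathways(model, nxt, path))
        return chains

    for chain in mine_pathways(pathway_model, "hands-on project sessions"):
        print(" -> ".join(chain))

  Each printed chain is one candidate pathway that an evaluation could focus on.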

  19. Comments from an Evaluation Partner… Shawn Smith, 4-H Issue Area Leader & Evaluation Project Manager, CCE – Cortland County (CCECC)

  20. CCECC Advisory Committee Input

  21. Internal Stakeholder Analyses

  22. CCECC Pathway Model: “Final” to date

  23. Note: You can do Pathway model visuals without the Netway!
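  One low-tech option, offered here only as a sketch and not as part of the Protocol’s toolkit: write the same links out in Graphviz DOT format and render them with any Graphviz installation (sticky notes, a whiteboard, or a drawing program work just as well). The link names below are hypothetical.

    # Dump pathway-model links to a Graphviz DOT file for a quick visual.
    links = [
        ("hands-on project sessions", "youth complete projects"),
        ("youth complete projects", "youth feel part of the group"),
        ("youth feel part of the group", "sustained program participation"),
    ]

    with open("pathway_model.dot", "w") as f:
        f.write("digraph pathway {\n  rankdir=LR;\n")   # left-to-right, like a logic model
        for source, target in links:
            f.write(f'  "{source}" -> "{target}";\n')
        f.write("}\n")
    # Render with: dot -Tpng pathway_model.dot -o pathway_model.png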

  24. Program Life Cycle [Figure: program impact plotted over time through four stages: initiation, growth, maturity, transformation.] Source: Program Leadership Certification, “Accountability and Evaluation” PowerPoint, Michael Duttweiler

  25. Quick Poll on Program Lifecycles Think about a program you are evaluating or are going to be evaluating. What lifecycle stage is it in? A – early development, pilot B – still revising/tweaking C – implemented consistently D – consistent across sites/facilitators and documented E – well-established, stable, candidate for replication

  26. Program & Evaluation Alignment (matching the program lifecycle stage to the evaluation lifecycle)
  Phase I: Initiation (Process & Response)
  • Phase IA. Is the program in its initial implementation(s)? → Process assessment and post-only evaluation of participant reactions and satisfaction.
  • Phase IB. Is the program in revision or re-implementation? → Post-only assessment of outcomes, implementation assessment, outcome measurement development, and assessment of internal consistency (reliability).
  Phase II: Development (Change)
  • Phase IIA. Is the program being implemented consistently? → Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement.
  • Phase IIB. Does the program have formal written procedures/protocol? → Matched pretest and posttest of outcomes; verification of reliability and validity of change; human subjects review.
  Phase III: Maturity (Comparison & Control)
  • Phase IIIA. Is the program associated with change in outcomes? → Controls and comparisons (control groups, control variables, or statistical controls).
  • Phase IIIB. Does the program have evidence of effectiveness? → Controlled experiments or quasi-experiments (randomized experiment; regression-discontinuity) for assessing program effectiveness.
  Phase IV: Generalizability (Dissemination; evaluation special projects)
  • Phase IVA. Is the effective program being implemented in multiple sites? → Multi-site analysis of integrated large data sets over multiple waves of program implementation.
  • Phase IVB. Is the evidence-based program being widely distributed? → Formal assessment across multiple program implementations that enables general assertions about the program in a wide variety of contexts (e.g., meta-analysis).
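  The alignment above behaves like a simple lookup: answer “yes” to one of the lifecycle questions and the table names the evaluation phase and approach that fit. A hypothetical Python sketch of that lookup (the phase labels and wording are condensed from the slide; the code itself is not part of the Protocol):

    # Lifecycle question (condensed) -> (evaluation phase, appropriate evaluation approach)
    ALIGNMENT = {
        "in initial implementation(s)":       ("Phase IA", "process assessment; post-only reactions and satisfaction"),
        "in revision or re-implementation":   ("Phase IB", "post-only outcomes; implementation assessment; reliability"),
        "implemented consistently":           ("Phase IIA", "unmatched pre/post; qualitative change; reliability and validity"),
        "formal written procedures/protocol": ("Phase IIB", "matched pre/post; verify change; human subjects review"),
        "associated with change in outcomes": ("Phase IIIA", "controls and comparisons"),
        "evidence of effectiveness":          ("Phase IIIB", "controlled or quasi-experiments"),
        "implemented in multiple sites":      ("Phase IVA", "multi-site analysis over multiple waves"),
        "widely distributed":                 ("Phase IVB", "formal assessment across implementations, e.g. meta-analysis"),
    }

    stage = "implemented consistently"        # hypothetical program
    phase, approach = ALIGNMENT[stage]
    print(f"{phase}: {approach}")             # Phase IIA: unmatched pre/post; ...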

  27. Determining Evaluation Scope • It’s all about making GOOD CHOICES… • What kind of evaluation is appropriate for the program lifecycle stage? • What are the key outcomes this program should be attaining? • What do important stakeholders care most about? • What will “work best” in this kind of program? • What kind of evaluation is feasible for this year? What should wait until a future year?

  28. Determining Evaluation Scope [Figure: a pathway model of activities, outputs, and short-, middle-, and long-term outcomes, with the evaluation scope narrowed by (1) key outcomes, key links, and key pathways, (2) stakeholders, and (3) internal priorities.]

  29. Comments from another Evaluation Partner… Linda Schoffel, Rural Youth Services Program Coordinator, CCE – Tompkins County (CCETC)

  30. Using the Pathway Model for making evaluation choices – RYS Rocketry Program

  31. Evaluation should “fit” the program (lifecycle, stakeholders, context, etc.). Sample evaluation question: Do youth who participate in RYS Rocketry feel like they are part of the group? (belonging) RYS Rocketry Program, CCE-Tompkins County Pathway Model, October 2009

  32. “Swimming against the Tide” • The most frequently cited challenge to program evaluation is lack of time. • The systems approach involves spending a lot of time before you even get to the point of choosing measures… • Programs often face significant pressure for more evaluation, and for evidence of “impact”… • The systems approach argues, essentially, that “less is more” if the evaluation truly “fits” the program.

  33. Making it feasible for the long term • Key ingredients that help: • focusing on the most valuable elements (choosing well) • identifying interim benefits of the process • integrating with other needs • building on others’ progress • sharing resources

  34. Wrapping Up … Thank you! Any questions for any of us, before returning to Bill…? For follow-up questions later, Monica Hargraves: mjh51@cornell.edu Shawn Smith: scs239@cornell.edu Linda Schoffel: ljs48@cornell.edu Also see our website at http://core.human.cornell.edu/
