
Program Evaluation

Program Evaluation. Debra Spielmaker, PhD Utah State University School of Applied Sciences, Technology & Education - Graduate Program Advisor USDA-NIFA, Agriculture in the Classroom - Project Director.


Presentation Transcript


  1. Program Evaluation Debra Spielmaker, PhD Utah State University School of Applied Sciences, Technology & Education - Graduate Program Advisor USDA-NIFA, Agriculture in the Classroom - Project Director

  2. The Southern Region wants to know “How do I Prove the Value of My Program to Teachers and Industry Partners?” • We’re already keeping track of the numbers of teachers and students reached, but we would like to know the following: • Do teachers continue to use AITC materials after they leave our workshops? • How are teachers using AITC materials in their classrooms? • What concepts are teachers covering with AITC materials received at workshops or ordered online from our websites?

  3. Using program evaluation to answer questions • What is the purpose of a program evaluation? • To gain insights or to determine necessary inputs • To find areas in need of improvement in order to change practices • To assess program effectiveness • To conduct a cost analysis • To determine sustainability • What is a program, and how does it differ from a project? • Who cares about the data? • What difference will it make?

  4. Who are the stakeholders and who are the boundary partners? • Stakeholders: People who have a stake or a vested interest in the program, policy, or product being evaluated, and who also have a stake in the evaluation (Mertens & Wilson, 2012, p. 562). • Boundary Partners: The individuals, groups, or organizations with whom the program works directly (Buskens & Early, 2008, p. 174).

  5. What is the role of stakeholders in program evaluation? • Stakeholders help to build a program theory: a description of how the program should work. • Stakeholders identify the elements they believe are necessary to achieve their desired results. • They help to build a theory-based model (logic model, log frame, or program theory model) with specific outcomes. • They help with outcome mapping: • identifying the program’s vision and target population • identifying boundary partners • identifying available inputs • prioritizing outcomes • outlining how program initiatives will be evaluated

  6. Logic Models • A program theory about how outputs (accomplishments) or interventions work to achieve outcomes (impacts). • Elements of a logic model: • Situation • Audience • Inputs (resources) • Outputs (activities or delivery) • Outcomes (short and long term) • Impact
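The logic model elements listed above can be sketched as a plain data structure. This is a minimal illustration only; the element names follow the slide, but the AITC workshop values filled in here are hypothetical examples, not data from an actual program.

```python
# A logic model sketched as a dictionary; every value here is a
# hypothetical placeholder for illustration.
logic_model = {
    "situation": "Teachers lack agricultural literacy resources",
    "audience": "K-12 teachers attending AITC workshops",
    "inputs": ["staff time", "curriculum materials", "workshop funding"],
    "outputs": ["workshops delivered", "lesson plans distributed"],
    "outcomes": {
        "short_term": "teachers gain knowledge of AITC materials",
        "long_term": "teachers integrate AITC lessons into curricula",
    },
    "impact": "students develop agricultural literacy",
}

# A quick completeness check: every element should be specified
# before outcomes are prioritized with stakeholders.
missing = [key for key, value in logic_model.items() if not value]
print(missing)  # an empty list means every element is filled in
```

Writing the model down in a structured form like this makes it easy to check with stakeholders that no element has been skipped.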

  7. W. K. Kellogg Foundation (2004)

  8. W. K. Kellogg Foundation (2004)

  9. W. K. Kellogg Foundation (2004)*

  10. Evaluating with a Logic Model W. K. Kellogg Foundation (2004)

  11. W. K. Kellogg Foundation (2004) So with this knowledge…let’s evaluate a few logic models.

  12. Methods - Evaluation Designs • Pre-Post Designs • Experimental (random) • Quasi-experimental designs (selected on criteria) • Ex post facto designs (typically summative evaluations) Strengths and Weaknesses*
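A pre-post design, the first of the designs listed above, can be summarized with a paired comparison. The sketch below uses only Python's standard library; the scores are fabricated for illustration, where a real evaluation would use matched pre- and post-workshop assessment results.

```python
# A sketch of a pre-post comparison; all scores are hypothetical.
from math import sqrt
from statistics import mean, stdev

pre  = [52, 60, 45, 70, 58, 63, 49, 55]   # pre-workshop test scores
post = [61, 72, 50, 78, 66, 70, 58, 64]   # same teachers, post-workshop

gains = [b - a for a, b in zip(pre, post)]
n = len(gains)

# Paired t statistic: mean gain divided by its standard error.
t = mean(gains) / (stdev(gains) / sqrt(n))

print(f"mean gain = {mean(gains):.1f}, paired t = {t:.2f}")
```

The pairing is the design's strength: each teacher serves as their own control, which removes between-teacher variation from the comparison.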

  13. Data Collection • Who is the target population? • What outcomes will be measured, and how will they be measured (knowledge, behaviors, attitudes, skills)? • What types of measures? • Perceptions: Self-reported survey assessments, concept maps • Content Knowledge: Tests • Inventories • Designing an Instrument (see wiki) • Logistical Requirements • Time • Money • Expertise • Access • Data Collection Tools (see wiki)

  14. Analysis • Conducting the Analysis • Stats Primer on Means, SD, statistical significance, and effect sizes (see wiki) • How will you use the results? • What are your performance targets? • Reporting
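An effect size complements statistical significance by expressing how large a difference is in standard-deviation units. Below is a sketch of Cohen's d for two independent groups (e.g. workshop vs. no-workshop teachers); the scores are hypothetical.

```python
# A sketch of an effect-size calculation (Cohen's d) with pooled SD.
# All scores are hypothetical illustrations.
from math import sqrt
from statistics import mean, stdev

treatment = [78, 85, 72, 90, 81, 76]   # post-test scores, workshop group
control   = [70, 74, 68, 79, 72, 71]   # post-test scores, comparison group

# Pooled standard deviation, then d = difference in means / pooled SD.
n1, n2 = len(treatment), len(control)
s1, s2 = stdev(treatment), stdev(control)
pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
d = (mean(treatment) - mean(control)) / pooled

print(f"Cohen's d = {d:.2f}")
```

By Cohen's conventional benchmarks, d around 0.2 is small, 0.5 medium, and 0.8 or larger is large; reporting d alongside a significance test gives stakeholders a sense of practical, not just statistical, importance.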

  15. “How do I Prove the Value of My Program to Teachers and Industry Partners?” • Do teachers continue to use AITC materials after they leave our workshops? • How are teachers using AITC materials in their classrooms? • What concepts are teachers covering with AITC materials received at workshops or ordered online from our websites? Method, Data Collection, Analysis*

  16. Courses for credit can assess all three questions as a course requirement • The Instructional Hours Reflection requires the following information (ex post facto): • Number of classroom instructional hours spent on this lesson • Number of students in the classroom • Strengths of the lesson and/or improvement suggestions • Additional classroom activities conducted, additional classroom resources used, and teaching strategies or methods used to deliver this lesson • A minimum of three photographs of the activities, student work, or additional electronic files created to support the lesson, e.g. worksheets, PowerPoint, reading guides, etc.
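The quantitative fields collected in the reflections roll up directly into program-reach reporting. A minimal aggregation sketch, with hypothetical field names and records standing in for actual submissions:

```python
# A sketch of aggregating Instructional Hours Reflection submissions.
# Field names and records are hypothetical illustrations of the data
# points listed on the slide.
reflections = [
    {"hours": 3, "students": 28, "lesson": "Soil Science"},
    {"hours": 2, "students": 25, "lesson": "From Farm to Table"},
    {"hours": 4, "students": 30, "lesson": "Soil Science"},
]

total_hours = sum(r["hours"] for r in reflections)
total_students = sum(r["students"] for r in reflections)
print(f"{total_hours} instructional hours reaching {total_students} students")
```

Because the same fields are required of every participant, totals like these can be reported consistently across courses and workshops.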

  17. A final Strategy Report: End of a credit course or at the end of a professional development workshop • Outline your strategy for implementing the Agriculture in the Classroom concepts, lesson plans, and activities into your classroom in the future. Your response should include specifics about what lessons, activities, teaching and instructional strategies, and other integration tactics you plan to use in your curriculum during the next year.

  18. Epilogue • Is there a way to develop uniform questions and a uniform electronic method for following up with teachers involved in our programs?
