
Measuring program success WDE Fall School Improvement Conference Tuesday, September 24, 2013


Presentation Transcript


  1. Measuring program success WDE Fall School Improvement Conference Tuesday, September 24, 2013 Lauren Amos, NDTAC TA Liaison, American Institutes for Research

  2. About NDTAC • Funded by the U.S. Department of Education (ED) • Operated by the American Institutes for Research (AIR) • Our mission: • Develop a uniform evaluation model for State Education Agency (SEA) Title I, Part D, programs • Provide technical assistance (TA) to States in order to increase their capacity for data collection and their ability to use that data to improve educational programming for N & D youth • Serve as a facilitator between different organizations, agencies, and interest groups that work with youth in neglected and delinquent facilities

  3. Session Agenda • Setting the Stage • Program Evaluation Methods • Selecting Performance Indicators • Powerful Program Evaluation Tools • Cost-Saving Strategies • Other Program Evaluation Resources

  4. Setting the Stage • In what type of educational settings are you working? • What program evaluation strategies are working well in your settings? • What kind of data are you currently using to evaluate your programs? • How have you used data for program quality improvement? • What questions, challenges and/or concerns bring you to this session?

  5. Program Evaluation Methods • Three methods for assessing program performance: • Program (or compliance) monitoring • Performance measurement (or implementation evaluation) • Collecting information to measure a program’s progress toward its goals and objectives and/or the quality of program implementation to inform ongoing program improvement • Approaches: internal evaluation or independent external evaluation • Impact evaluation • Assesses how effectively a program is achieving its goals; uses methods to determine whether program outcomes are attributable to the program or to other factors • Approaches: internal evaluation or independent external evaluation

  6. How do you decide which method to use? • Which method you choose depends on what kinds of questions you want to be able to answer, for example… • Program Monitoring • Are all core instructors “highly qualified”? • Performance Measurement • Did the program meet its behavior modification goals for the year? • Did the program successfully transition more students into local neighborhood schools upon release? • Impact Evaluation • Did the new literacy program implemented last year have a greater effect on student reading scores than the previous program?
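The literacy question above is the hardest of the three to answer, so a small worked example may help make the logic concrete. Below is a minimal sketch in Python of comparing reading-score gains under the new and old literacy programs; all numbers are hypothetical, and a real impact evaluation would need a defensible comparison group before attributing any difference to the program.

```python
# Minimal sketch of the impact-evaluation question above: did the new
# literacy program produce larger reading-score gains than the old one?
# All numbers below are hypothetical illustrations, not program data.
from statistics import mean

from scipy import stats  # SciPy's two-sample t-test

# Pre-to-post reading-score gains for students under each program
new_program_gains = [12, 8, 15, 10, 9, 14, 11, 13]
old_program_gains = [7, 9, 6, 10, 5, 8, 7, 6]

difference = mean(new_program_gains) - mean(old_program_gains)
t_stat, p_value = stats.ttest_ind(new_program_gains, old_program_gains)

print(f"Average gain difference: {difference:.1f} points")
print(f"p-value: {p_value:.3f}")  # a small p-value suggests the gap is unlikely to be chance
# Caveat: without random assignment or a matched comparison group, the gap
# cannot be attributed to the program rather than to other factors.
```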

  7. Which evaluation method would be best? • How satisfied were students with program activities? • What benefits did teachers gain from the professional development opportunities? • Can the program model be replicated in other parts of the state with the same likelihood of success? • Who was served by the program, and do the characteristics of these individuals match the target population? • What suggestions do stakeholders have to improve the program? • Did recidivism rates decrease as a result of the program? • What resources, policies, or procedures changed or were developed as a result of the program? • Did the program deliver the activities specified in the funding contract? • Did the program cause the reduction in substance abuse?

  8. Selecting Performance Indicators Leading Indicators are outputs and short-term outcomes: • Demonstrate signs of growth or change in a given direction suggesting early wins and areas of improvement • Provide an early read on progress towards long-term outcomes • Measure conditions that are prerequisite to the desired outcomes (i.e., predict lagging indicators) Lagging Indicators are long-term or desired outcomes: • Measure the success and consequences of activities that have already occurred • Measure achievement of the desired outcomes
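A quick way to test whether a candidate leading indicator actually predicts a lagging one is to correlate the two across sites or reporting periods. The sketch below uses hypothetical site-level rates; `statistics.correlation` requires Python 3.10+.

```python
# Does an early-year leading indicator track a year-end lagging indicator?
# The site-level rates below are hypothetical.
from statistics import correlation  # Pearson's r; Python 3.10+

course_completion = [0.62, 0.71, 0.55, 0.80, 0.68, 0.74]  # leading, mid-year
graduation_rate = [0.48, 0.60, 0.41, 0.69, 0.55, 0.63]    # lagging, year-end

r = correlation(course_completion, graduation_rate)
print(f"Pearson r = {r:.2f}")
# A strong positive r supports treating course completion as an early read
# on progress toward the graduation outcome; a weak r argues for choosing
# a different leading indicator.
```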

  9. Leading and Lagging Indicators On the next slide, identify the leading indicators and their corresponding lagging indicators

  10. Leading and Lagging Indicators • Teacher turnover rate • Number of youth who earn a CTE certificate • Course completion rate • Number of youth who begin a technical trade while in aftercare • Number of disciplinary incidents • Hours of professional development

  11. Selecting Performance Indicators What kind of indicator is each of the following, and why? Any caveats? • Graduation rate • Enrollment rate • GED enrollment rate • Number of CTE certificates awarded • Number of CTE certificates earned • Recidivism rate • Types of CTE courses offered • Number of CTE courses offered • Per pupil spending • Number of youth served • Percentage of HQT by FTE • Bed count • High school transcript • Average SAT/ACT score • Course completion rate
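Every rate on this list needs an explicitly defined numerator and denominator before it can be compared across facilities; that is where most of the caveats live. As a purely illustrative sketch with hypothetical student records:

```python
# Computing two indicators from the list above out of hypothetical records.
# The caveats come from the definitions: what counts in the numerator and
# who belongs in the denominator.
records = [
    # (student_id, courses_enrolled, courses_completed, earned_cte_certificate)
    ("A01", 4, 3, True),
    ("A02", 5, 5, False),
    ("A03", 3, 1, False),
    ("A04", 4, 4, True),
]

courses_enrolled = sum(enrolled for _, enrolled, _, _ in records)
courses_completed = sum(completed for _, _, completed, _ in records)
course_completion_rate = courses_completed / courses_enrolled

cte_certificates_earned = sum(1 for *_, earned in records if earned)

print(f"Course completion rate: {course_completion_rate:.0%}")  # 81%
print(f"CTE certificates earned: {cte_certificates_earned}")    # 2
# Caveat: short lengths of stay depress completion rates, so the denominator
# (e.g., only courses a youth was enrolled in long enough to finish) matters.
```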

  12. Powerful Program Evaluation Tools • Evaluation Plans • Evaluation Plan at a Glance (see sample) • Evaluation Frameworks • Conceptual Frameworks and Models • Logic Models • Data Collection Tools • Surveys • Observation Checklists/Rubrics (see sample) • Document Review Checklists/Rubrics (see sample) • Other Program Evaluation Resources

  13. Evaluation Planning • Define the problem(s) or need(s) • Set goals and objectives • Outline programming • Is it evidence-based? • Does it address the identified problems and needs? • Is it aligned with goals and objectives? • Identify measures • Develop a logic model • Collect and analyze data • Report findings • Revisit the logic model

  14. Evaluation Frameworks • What aspects of your program do you want to measure? DON’T REINVENT THE WHEEL! • Conditions for learning: NDTAC Brief: Improving Conditions for Learning for Youth Who Are Neglected or Delinquent • Transitional services: Transition Toolkit 2.0: Meeting the Educational Needs of Youth Exposed to the Juvenile Justice System • Literacy teaching and learning: NDTAC Guide: Meeting the Literacy Needs of Students in Juvenile Justice Facilities • Mathematics teaching and learning: NDTAC Guide: Making It Count: Strategies for Improving Mathematics Instruction for Students in Short-Term Facilities

  15. CJJR White Paper

  16. Content of CJJR White Paper

  17. Individual Academic and Behavioral Approaches Strategy Guide

  18. Education Across Multiple Settings • Community-Based Traditional and Alternative Schools • Day Treatment Centers • Group Homes • Residential Treatment Centers • Detention and Correctional Facilities

  19. Practices and Strategies

  20. Logic Models
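Slide 20 presented the logic model graphically. Structurally, a logic model chains a program's inputs to its activities, outputs, and short- and long-term outcomes; the sketch below shows that structure in code, with entirely hypothetical example content (see the development guides on the next slide for real templates).

```python
# Structural sketch of a logic model's standard components. The example
# program content is hypothetical; see slide 21's guides for real templates.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products (leading indicators)
    short_term_outcomes: list[str] = field(default_factory=list)
    long_term_outcomes: list[str] = field(default_factory=list)  # lagging indicators


literacy_model = LogicModel(
    inputs=["Title I, Part D funds", "certified reading teachers"],
    activities=["daily small-group literacy instruction"],
    outputs=["hours of instruction delivered", "students assessed each quarter"],
    short_term_outcomes=["reading-score gains"],
    long_term_outcomes=["credit accrual", "successful transition back to school"],
)
print(literacy_model.long_term_outcomes)
```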

  21. Logic Model Development Resources • National Juvenile Justice Evaluation Center’s Logic Model Toolkit for Juvenile Justice Service Providers (http://www.jrsa.org/njjec/publications/logic_model_toolkit.pdf) • The W.K. Kellogg Foundation Logic Model Development Guide (http://www.wkkf.org/knowledge-center/resources/2006/02/wk-kellogg-foundation-logic-model-development-guide.aspx) • National Juvenile Justice Evaluation Center Logic Model Tutorial (http://www.jrsa.org/njjec/njjec-tutorial/category/learning3/) • The Office of Juvenile Justice and Delinquency Prevention’s (OJJDP) Logic Models page (http://www.ojjdp.gov/grantees/pm/logic_models.html) • The Bureau of Justice Assistance’s (BJA) Center for Program Evaluation and Performance Measurement’s “Developing and Working with Logic Models” page (https://www.bja.gov/evaluation/guide/pe4.htm)

  22. Sample Logic Models Developed by Programs Serving Neglected or Delinquent Children and Youth • The Tribal Youth Logic Model (http://www.tribalyouthprogram.org/sites/tribalyouthprogram.org/files/5%20Logic%20Model%20-%20Final.pdf) • Illinois’s Comprehensive Community-Based Youth Services (https://www.dhs.state.il.us/page.aspx?item=65902)

  23. Cost-Saving Strategies • Performance measurement strategies • Recruit local college students to collect and analyze data • Conduct online focus groups with staff and students using webcam technology and free online meeting software • Impact evaluation strategies • Partner with local organizations, agencies, and/or universities to pursue federal, state, or philanthropic funding • Partner with a vendor (e.g., offer to be a case study for a white paper featuring the use of their product in a subgrantee’s facilities)

  24. Other Program Evaluation Resources • Bureau of Justice Assistance Center for Program Evaluation (https://www.bja.gov/evaluation/index.html) • Justice Research and Statistics Association Program Evaluation Briefing Series (http://www.jrsa.org/pubs/juv-justice/index.html) • Approaches to Assessing Juvenile Justice Program Performance • Hiring and Working With An Evaluator • Strategies for Evaluating Small Juvenile Justice Programs

  25. Contact Information Lauren Amos NDTAC TA Liaison American Institutes for Research lamos@air.org
