
Performance Measurement Community Literacy

Performance Measurement Community Literacy. March 19, 2007. Harry P. Hatry, The Urban Institute, Washington, DC. Key distinctions: Performance Measurement vs. Program Evaluation; Performance Measurement vs. Performance Management.


Presentation Transcript


  1. Performance Measurement Community Literacy. March 19, 2007. Harry P. Hatry, The Urban Institute, Washington, DC

  2. Key Distinctions • Performance Measurement vs. Program Evaluation • Performance Measurement vs. Performance Management

  3. PROGRAM EVALUATIONS vs. PERFORMANCE MONITORING
     • Frequency: Irregular vs. Regular, continuing
     • Coverage: Done on only a few programs vs. Covers most programs
     • Depth of information: Seeks reasons for poor performance vs. Only tells "the score," not why
     • Cost: High for each study vs. Cost spread out
     • Utility: Major decisions vs. Continuous program improvement

  4. Performance Measurement Information Plus Use of that Information to Improve Services Produces Performance Management

  5. Outcome Sequence Chart
     Fed/State Funds Provided → Organization Develops Improvement Plan → Teachers Implement Changes to Instructional Practice in Classroom (Output) → Students Participate in Regular Classroom Instruction → Students Demonstrate Improved Performance ("Intermediate" Outcomes) → Students Successfully Complete Education Requirements → Students Enrolled in Post-Secondary Education and/or Employed ("End" Outcomes)

  6. Outcome Sequence Chart with Indicators
     • Fed/State Funds Provided: quantity of funds provided
     • Organization Develops Improvement Plan: # of schools with SEA-approved plans
     • Teachers Implement Changes to Instructional Practice in Classroom (Output): #/% of teachers reporting changes
     • Students Participate in Regular Classroom Instruction ("Intermediate" Outcome): #/% of students participating in regular instruction
     • Children with Disabilities Demonstrate Improved Performance ("Intermediate" Outcome): #/% of students demonstrating improved performance
     • Children Successfully Complete General Education Requirements ("End" Outcome): #/% of students who complete education requirements
     • Students Enrolled in Post-Secondary Education and/or Employed ("End" Outcome): #/% of students enrolled in post-secondary education and/or employed

  7. Suggested Added Outcome Indicators • #/% of students whose test scores showed one year's gain. [End Outcome] • #/% of students reporting satisfaction with the assistance they received. [End Outcome] • # of students who volunteered for the assistance. [Intermediate Outcome] • % of eligible students who enrolled (number enrolling divided by the number eligible). [Intermediate Outcome]
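The last two indicators above are simple rates: a count divided by an eligible or enrolled base. A minimal sketch of the arithmetic, with all counts hypothetical:

```python
# Sketch of computing rate-style outcome indicators.
# All counts below are hypothetical, for illustration only.

def rate(count, total):
    """Return a count as a percentage of a total, rounded to one decimal."""
    return round(100.0 * count / total, 1)

eligible_students = 400        # hypothetical number of eligible students
students_enrolling = 250       # hypothetical number who enrolled
students_with_year_gain = 180  # hypothetical number (of those enrolled) with a one-year gain

# Intermediate outcome: % of eligible students who enrolled
enrollment_rate = rate(students_enrolling, eligible_students)
# End outcome: %/# of enrolled students whose test scores showed one year's gain
gain_rate = rate(students_with_year_gain, students_enrolling)

print(f"Enrollment rate: {enrollment_rate}%")    # 62.5%
print(f"One-year-gain rate: {gain_rate}%")       # 72.0%
```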

  8. TYPICAL SERVICE QUALITY CHARACTERISTICS 1. TIMELINESS/WAIT TIMES 2. PLEASANTNESS/FRIENDLINESS 3. CONVENIENCE/ACCESSIBILITY • OF HOURS OF OPERATION • CUSTOMER CAN REACH SOMEONE TO TALK TO 4. AWARENESS OF PROGRAM SERVICES 5. CLARITY OF INFORMATION/REGULATIONS 6. STAFF/TEACHER HELPFULNESS/KNOWLEDGE 7. OVERALL CUSTOMER SATISFACTION

  9. Categories of Data Sources and Collection Procedures • Agency Records • Administered Tests • Customer Surveys • Trained Observer Procedures • Expert Judgments • Focus Groups

  10. Sample Outcome Information From Customer Surveys • Ratings of overall satisfaction • Ratings of specific service quality characteristics • Ratings of results of the service • Whether actions/behavior sought by the program occurred • Extent of service use • Awareness of services • Reasons for dissatisfaction or non-use • Suggestions for improvements

  11. Making Performance Information Really Useful 1. Provide frequent, timely information to programs and their staffs. 2. Set targets each year. 3. Disaggregate outcome data by customer and service characteristics. 4. Do regular, basic analysis of the data, such as comparisons. 5. Seek explanations for unexpected outcomes.

  12. Percent of Students Who Reported the Program's Assistance Had Helped Them Improve Their Reading

                                   Very or
                                   Somewhat             Difference
                                   Helpful    Target    (Percentage Points)      N
      All Clients                    50%       60%           -10               560
      Gender
        Females                      30%       60%           -30               230
        Males                        64%       60%            +4               330
      Beginning Reading Level
        Lowest                       60%       60%             0               100
        Second                       55%       60%            -5               220
        Third                        44%       60%           -16               180
        Highest                      33%       60%           -27                60
      Faculty
        A                            53%       60%            -7               190
        B                            67%       60%            +7                30
        C                            33%       60%           -27               150
        D                            58%       60%            -2               190
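Disaggregation of this kind is simple arithmetic: subtract the target from each group's rate and flag shortfalls. A minimal Python sketch, with figures taken from the slide's table (the flagging format is an assumption):

```python
# Sketch of the slide's disaggregation analysis: compare each group's
# outcome rate to the target in percentage points. Percentages and Ns
# are from the slide's table; the report layout is an assumption.

TARGET = 60  # target: % rating the assistance "very or somewhat helpful"

groups = {  # group name -> (% helpful, N)
    "All Clients": (50, 560),
    "Females": (30, 230),
    "Males": (64, 330),
    "Reading level: Lowest": (60, 100),
    "Reading level: Second": (55, 220),
    "Reading level: Third": (44, 180),
    "Reading level: Highest": (33, 60),
    "Faculty A": (53, 190),
    "Faculty B": (67, 30),
    "Faculty C": (33, 150),
    "Faculty D": (58, 190),
}

for name, (pct, n) in groups.items():
    diff = pct - TARGET  # percentage points above (+) or below (-) target
    flag = "  <-- below target" if diff < 0 else ""
    print(f"{name:25s} {pct:3d}%  (n={n:3d})  {diff:+4d} pts{flag}")
```

The point of the disaggregation survives the sketch: the overall 50% hides large shortfalls among females, higher-reading-level students, and Faculty C.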

  13. Which Hospital Would You Choose?

                              MERCY HOSPITAL                  APOLLO HOSPITAL
      All surgery patients    2,100 patients, 63 deaths       800 patients, 16 deaths
                              (3% death rate)                 (2% death rate)
      BUT...
      Patients in good        600 patients, 6 deaths          600 patients, 8 deaths
      condition               (1% death rate)                 (1.3% death rate)
      Patients in poor        1,500 patients, 57 deaths       200 patients, 8 deaths
      condition               (3.8% death rate)               (4% death rate)
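The hospital example is a classic instance of Simpson's paradox: Mercy has the worse overall death rate, yet the better rate within each patient-condition group. A quick Python check of the slide's figures:

```python
# Worked check of the slide's hospital comparison (Simpson's paradox).
# All counts are taken directly from the slide.

data = {  # hospital -> condition -> (patients, deaths)
    "Mercy":  {"good": (600, 6), "poor": (1500, 57)},
    "Apollo": {"good": (600, 8), "poor": (200, 8)},
}

def death_rate(patients, deaths):
    """Deaths as a percentage of patients."""
    return 100.0 * deaths / patients

for hospital, groups in data.items():
    total_patients = sum(p for p, _ in groups.values())
    total_deaths = sum(d for _, d in groups.values())
    print(f"{hospital}: overall {death_rate(total_patients, total_deaths):.1f}%")
    for condition, (patients, deaths) in groups.items():
        print(f"  {condition} condition: {death_rate(patients, deaths):.1f}%")
```

Mercy is better in both subgroups (1.0% vs. 1.3% good, 3.8% vs. 4.0% poor) but worse overall (3.0% vs. 2.0%) because it treats far more poor-condition patients, which is why disaggregated comparisons matter.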

  14. Types of Comparison  Compare the Latest Outcome Data: 1. To previous performance 2. To targets set by the organization 3. Among categories of customers 4. Among facilities 5. By type and amount of service 6. To results in other communities

  15. Making Performance Information Really Useful (Continued) 6. Hold “How Are We Doing?” sessions after each performance report. 7. Prepare “Service Improvement Action Plans” for areas with low performance. 8. Provide recognition rewards. 9. Identify successful practices.

  16. Outcome Indicators Project. A joint project of the Urban Institute and The Center for What Works. Website: http://www.urban.org/center/cnp/projects/outcomeindicators.cfm
     The Outcome Indicators Project provides a framework for tracking nonprofit performance. It suggests candidate outcomes and outcome indicators to assist nonprofit organizations that seek to develop new outcome monitoring processes or improve their existing systems. The website contains three primary elements:
     1. Building a Common Outcome Framework to Measure Nonprofit Performance
     2. Outcomes and Performance Indicators for 14 Specific Program Areas
     3. Nonprofit Taxonomy of Outcomes
     Program areas: Adult Education and Family Literacy; Advocacy; Affordable Housing; Assisted Living; Business Assistance; Community Organizing; Emergency Shelter; Employment Training; Health Risk Reduction; Performing Arts; Prisoner Re-entry; Transitional Housing; Youth Mentoring; Youth Tutoring

  17. Crocodiles may get you, but in the end it should be very worthwhile for student literacy.
