
NRS Data Monitoring for Program Improvement


Presentation Transcript


  1. NRS Data Monitoring for Program Improvement Unlocking Your Data M. Corley

  2. Objectives—Day 1 • Describe the importance of getting involved with and using data; • Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model; • Determine when and how to adjust standards for local conditions; • Set policy for rewards and sanctions for local programs; • Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention. M. Corley

  3. Agenda—Day 1 • Welcome, Introduction, Objectives, Agenda Review • The Power of Data • Why Get Engaged with Data? Exercise • The Data-driven Program Improvement Model • Setting Performance Standards • Adjusting Standards for Local Conditions • Establishing a Policy for Rewards and Sanctions • Getting Under the Data • Data Pyramids • Data Carousel • Evaluation and Wrap-up for Day 1 M. Corley

  4. Objectives—Day 2 • Distinguish between the uses of desk reviews and on-site monitoring of local programs; • Identify steps for monitoring local programs; • Identify and apply key elements of a change model; and • Work with local programs to plan for and implement changes that will enhance program performance and quality. M. Corley

  5. Agenda—Day 2 • Agenda Review • Planning for and Implementing Program Monitoring • Desk Reviews Versus On-site Reviews • Data Sources (small group work) • Steps and Guidelines for Monitoring Local Programs • Planning for and Implementing Program Improvement • A Model of the Program Improvement Process • State Action Planning • Closing and Evaluation M. Corley

  6. STOP! Why Get Engaged with Data? M. Corley

  7. Question for Consideration Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students? M. Corley

  8. The Motivation Continuum: Intrinsic to Extrinsic. Which is the more powerful force for change? M. Corley

  9. NRS Data-driven Program Improvement (Cyclical Model) STEPS • Set performance standards • Examine program elements underlying the data • Monitor program data, policy, and procedures • Plan and implement program improvement • Evaluate progress and revise, as necessary, and recycle M. Corley

  10. What’s Under Your Data? The Powerful Ps: Performance (Data), underpinned by Program, Policies, Procedures, Processes, and Products. M. Corley

  11. NRS Data-driven Program Improvement Model (a cycle centered on NRS data): Set Performance Standards -> Examine Program Elements Underlying the Data -> Monitor Program Data, Policy, Procedures -> Plan and Implement Program Improvement; Evaluate Improvement -> repeat. M. Corley

  12. Educational Gains for ESL Levels and Performance Standards Exhibit 1-2 M. Corley

  13. Questions Raised by Exhibit 1-2 • How were performance standards set? Based on past performance? • Are standards too low at the higher levels? • Is the performance pattern similar to that of previous years? If not, why not? • What are the program’s assessment and placement procedures? Are the same assessments used for high and low ESL levels? • How do curriculum and instruction differ by level? • What are student retention patterns by level? M. Corley
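
The kind of comparison behind Exhibit 1-2 can be made concrete. Below is a minimal sketch in Python, using hypothetical level names, gain rates, and standards (not the actual exhibit data), of how a state might flag ESL levels whose educational gain falls short of the performance standard:

    # Hypothetical gain rates (share of students completing a level) vs. standards.
    gains = {
        "ESL Beginning Literacy": {"actual": 0.42, "standard": 0.39},
        "ESL Low Intermediate":   {"actual": 0.36, "standard": 0.40},
        "ESL High Advanced":      {"actual": 0.21, "standard": 0.18},
    }

    for level, r in gains.items():
        gap = r["actual"] - r["standard"]
        status = "meets standard" if gap >= 0 else "below standard"
        print(f"{level}: {r['actual']:.0%} vs. {r['standard']:.0%} ({status}, gap {gap:+.0%})")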

  14. The Power of Data: Setting Performance Standards M. Corley

  15. Essential Elements of Accountability Systems • Goals • Measures • Performance Standards • Sanctions and Rewards M. Corley

  16. National Adult Education Goals Reflected in the NRS Outcome Measures of • educational gain, • GED credential attainment, • entry into postsecondary education, and • employment. M. Corley

  17. Performance Standards • Similar to a “sales quota”: how well are you going to perform this year? • Should be realistic and attainable, but • Should stretch you toward improvement • Set by each state in collaboration with ED • Each state’s performance is a reflection of the aggregate performance of all the programs it funds M. Corley

  18. Standards-setting Models • Continuous Improvement • Relative Ranking • External Criteria • Return on Investment (ROI) M. Corley

  19. Continuous Improvement • Standard based on past performance • Designed to make all programs improve compared to themselves • Works well when there is stability and a history of performance on which to base standard • Ceiling reached over time, resulting in little additional improvement M. Corley

  20. Relative Ranking • Standard is mean or median performance of all programs • Programs ranked relative to each other • Works for stable systems where median performance is acceptable • Improvement focus mainly on low-performing programs • Little incentive for high-performing programs to improve M. Corley

  21. External Criteria • Set by formula or external policy • Promotes a policy goal to achieve a higher standard • Used when large-scale improvements are called for, over the long term • Takes no account of past performance, so standards may be unrealistic or unattainable M. Corley

  22. Return on Investment • Ratio of the value of the program to its cost • A business model; answers the question: Are the services or program worth the investment? • Can be a powerful tool for garnering funding (high ROI) or for losing funding (low ROI) • May ignore other benefits of the program M. Corley
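
Purely as an illustration of how the four models differ, the sketch below computes a standard under each one using hypothetical program names, rates, stretch amounts, and dollar figures; it is not a prescribed NRS method:

    import statistics

    # Hypothetical prior-year educational gain rates for a state's local programs.
    past_rates = {"Program A": 0.35, "Program B": 0.48, "Program C": 0.41, "Program D": 0.29}

    # Continuous improvement: each program's own past performance plus a small stretch.
    continuous = {name: rate + 0.02 for name, rate in past_rates.items()}

    # Relative ranking: one statewide standard at the median of all programs.
    relative = statistics.median(past_rates.values())

    # External criteria: a target set by policy or formula, regardless of history.
    external = 0.50

    # Return on investment: value delivered relative to cost (hypothetical figures).
    value, cost = 120_000, 80_000
    roi = value / cost  # above 1.0 suggests the program is worth the investment

    print(continuous)
    print(f"Relative ranking standard: {relative:.0%}; external standard: {external:.0%}; ROI: {roi:.2f}")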

  23. Decision Time for State Teams • Which model(s) do you favor for setting standards for/with locals? • Is it appropriate to use one statewide model or different models for different programs? • How will you involve the locals in setting the standards they will be held to? M. Corley

  24. Question for Consideration How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs? M. Corley

  25. Adjusting Standards for Local Conditions Research suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality. WHY IS THIS SO? M. Corley

  26. Factors that May Require Adjustment of Standards • Student Characteristics • An especially challenging group • Students at lower end of level • Influx of different types of students • Local Program Elements • External Conditions M. Corley

  27. Shared Accountability State and locals share responsibility to meet accountability requirements • State provides tools and environment for improved performance • Locals agree to work toward improving performance M. Corley

  28. Locals should know… • The purpose of the performance standards; • The policy and programmatic goals the standards are meant to accomplish; • The standard-setting model that the state adopts; and • That State guidance and support are available to locals in effecting change. M. Corley

  29. Shared Accountability • Which state-initiated efforts have been easy to implement at the local level? • Which have not? • What factors contributed to locals’ successfully and willingly embracing the effort? • What factors contributed to a failed effort? M. Corley

  30. Shared Accountability (matrix of Local Program Involvement vs. State Administrative Control) • High involvement, low control: Locals Out of Control?? • High involvement, high control: Hot Dog!! We’re really moving! • Low involvement, low control: Anything Happening Out There?? • Low involvement, high control: Get OFF our backs!! M. Corley

  31. What About Setting Rewards and Sanctions? • Which is the more powerful motivator: rewards or sanctions? • List all the different possible reward structures you can think of for local programs. • How might sanctioning be counter-productive? • List sanctioning methods that will not destroy locals’ motivation to improve or adversely affect relationships with the state office. M. Corley

  32. Variations on a Theme Exercise • (Refer to H-10). Brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards. • Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards. • Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note. • When you have finished, wait for further instructions from the facilitator. M. Corley

  33. Summary of Local Performance Standard-setting Process M. Corley

  34. Getting Under the Data NRS data, as measured and reported by states, represent the product of underlying programmatic and instructional decisions and procedures. M. Corley

  35. Four Sets of Measures • Educational gain • NRS Follow-up Measures • Obtained a secondary credential • Entered and retained employment • Entered postsecondary education • Retention • Enrollment M. Corley

  36. Educational Gain: program elements underlying the measure include assessment policies and approach, assessment procedures, instruction, goal setting and placement procedures, retention, class organization, and professional development. M. Corley

  37. Follow-up Measures (GED, employment, postsecondary): underlying elements include instruction, support services, tracking procedures, goal-setting, retention, and professional development. M. Corley

  38. Retention: underlying elements include students, class schedules and locations, placement procedures, instruction, support services, retention support and policies, and professional development. M. Corley

  39. Enrollment: underlying elements include community characteristics, class schedules and locations, recruitment, instruction, and professional development. M. Corley

  40. Data Carousel M. Corley

  41. Question for Consideration How might it benefit local programs if the State office were to initiate and maintain a regular monitoring schedule to compare local program performance against performance standards? M. Corley

  42. Regular Monitoring of Performance Compared with Standards • Keeps locals focused on outcomes and processes; • Highlights issues of importance; • Increases staff involvement in the process; • Helps refine data collection processes and products; • Identifies areas for program improvement; • Identifies promising practices; • Yields information for decision-making; • Enhances program accountability. M. Corley

  43. BUT… • How can states possibly monitor performance of all local programs? • Don’t we have enough to do already?? • Where will we find staff to conduct the reviews? • You’re kidding, right?? M. Corley

  44. Not! M. Corley

  45. So….Let’s Find Some Answers • How can you monitor performance of locals without overburdening state staff? • What successful models are already out there?? • How does your state office currently ensure local compliance with state requirements? • Can you build on existing structures? M. Corley

  46. Approaches to Monitoring • Desk Reviews: ongoing process; useful for quantitative data (proposals, performance measures, program improvement plans, staffing patterns, budgets). • On-site Reviews: single event lasting 1-3 days; useful for qualitative data; review of processes and program quality; input from diverse stakeholders. M. Corley
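
Because desk reviews lean on quantitative data, the first pass can be scripted. The sketch below uses hypothetical program names, measures, and standards to flag local programs whose reported performance falls below standard, so that staff time goes to follow-up rather than to pulling numbers:

    # Hypothetical desk-review data: (measure, actual, standard) for each local program.
    reports = {
        "Program A": [("educational gain", 0.37, 0.40), ("entered employment", 0.52, 0.45)],
        "Program B": [("educational gain", 0.44, 0.40), ("GED attainment", 0.30, 0.35)],
    }

    for program, measures in reports.items():
        shortfalls = [(m, a, s) for m, a, s in measures if a < s]
        if not shortfalls:
            print(f"{program}: meets all standards")
            continue
        print(f"{program}: follow-up needed")
        for measure, actual, standard in shortfalls:
            print(f"  {measure}: {actual:.0%} vs. standard {standard:.0%}")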

  47. Advantages and Disadvantages of Desk Reviews M. Corley

  48. Advantages and Disadvantages of On-site Reviews M. Corley

  49. Data Collection Strategies for Monitoring • Program Self-Reviews (PSRs) • Document Reviews • Observations • Interviews M. Corley

  50. Program Self-Reviews • Conducted by local program staff • Review indicators of program quality • Completed in advance of monitoring visit and can help focus the on-site review • Results can guide the program improvement process M. Corley
