
Personnel Development to Improve Services and Results for Children with Disabilities





  1. Personnel Development to Improve Services and Results for Children with Disabilities: PERFORMANCE MEASURES. Craig Stanton, Office of Planning, Evaluation, and Policy Development; Bonnie Jones, Office of Special Education Programs. July 18, 2007

  2. Program Assessment Rating Tool (PART) Developed by the Office of Management & Budget – BPI Council. Designed to:
  • Provide a systematic method of assessing the performance of Federal programs across agencies
  • Create a better way of linking budget decisions to performance (i.e., Budget and Performance Integration)
  • Incorporate a review of the broad array of factors that affect program outcomes
  • Build on and improve GPRA (Government Performance and Results Act) performance reporting and performance management

  3. Structure of the PART
  • Asks a series of yes/no questions organized into four sections:
  I. Program Purpose & Design
  II. Strategic Planning
  III. Program Management
  IV. Program Results and Accountability
  • 25 basic questions, plus selected sets of questions tailored to specific types of programs (formula, discretionary, direct service, R&D, etc.)
  • Weights are assigned to each section and question
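To make the weighting concrete, here is a minimal sketch (in Python) of how per-section scores could roll up into an overall PART score. The 20/10/20/50 section weights are the commonly cited PART defaults, and the example scores are invented; OMB tailors question weights program by program, so treat the numbers as illustrative only.

```python
# Illustrative roll-up of PART section scores into an overall score.
# The section weights below are the commonly cited defaults (Purpose 20%,
# Planning 10%, Management 20%, Results 50%); actual question-level weights
# can vary by program, so treat these numbers as an assumption.

SECTION_WEIGHTS = {
    "I. Program Purpose & Design": 0.20,
    "II. Strategic Planning": 0.10,
    "III. Program Management": 0.20,
    "IV. Program Results and Accountability": 0.50,
}

def overall_part_score(section_scores: dict[str, float]) -> float:
    """Combine per-section scores (each 0-100) using the section weights."""
    return sum(SECTION_WEIGHTS[name] * score
               for name, score in section_scores.items())

if __name__ == "__main__":
    # Invented example scores, purely for illustration.
    example = {
        "I. Program Purpose & Design": 100.0,
        "II. Strategic Planning": 87.5,
        "III. Program Management": 90.0,
        "IV. Program Results and Accountability": 60.0,
    }
    print(f"Overall PART score: {overall_part_score(example):.1f}")  # prints 76.8
```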

  4. How does a program earn a high PART rating?
  • Must have meaningful program outcome objectives, indicators, and efficiency measures
  • Must collect performance data, with baselines and ambitious targets
  • Must use data to manage the program

  5. Scores Translated into PART Ratings
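The table behind this slide is not reproduced in the transcript. The sketch below uses the published PART rating bands (Effective 85-100, Moderately Effective 70-84, Adequate 50-69, Ineffective 0-49, with "Results Not Demonstrated" for programs lacking acceptable measures or data); the cutoffs are stated here as an assumption to be verified against the PART Guidance.

```python
# Sketch of the score-to-rating translation the slide's table presumably showed.
# Band cutoffs follow the published PART bands; confirm against the actual
# PART Guidance before relying on them.

def part_rating(score: float, has_acceptable_measures_and_data: bool = True) -> str:
    if not has_acceptable_measures_and_data:
        # Programs without acceptable measures or data are rated
        # "Results Not Demonstrated" regardless of the numeric score.
        return "Results Not Demonstrated"
    if score >= 85:
        return "Effective"
    if score >= 70:
        return "Moderately Effective"
    if score >= 50:
        return "Adequate"
    return "Ineffective"

print(part_rating(76.75))                                       # Moderately Effective
print(part_rating(92, has_acceptable_measures_and_data=False))  # Results Not Demonstrated
```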

  6. Are PARTs and scores public?
  • Every program that has been reviewed using the PART has a summary at: www.expectmore.gov
  • Summaries link to detailed program assessments
  • Additional resources for the PART, including detailed instructions for every question (the PART Guidance), may be found at: http://www.whitehouse.gov/omb/part/index.html

  7. So what happens after the PART?
  • Agencies are held accountable for implementing PART follow-up actions, also known as improvement plans, for every PARTed program
  • Follow-up actions are program specific
  • They generally include actions like developing better measures, collecting better data, or developing a plan for evaluating the program

  8. National Activities Program (Part D)
  • State Personnel Development Grants
  • Parent Training & Information
  • Student Results
  • Technology
  • Personnel Development
  • Technical Assistance, Model Demonstration, Dissemination, & Implementation

  9. OSEP Accountability Framework [diagram]: Program Performance Measures; Scholar Data; Personnel Data; Project Data

  10. OSEP Accountability Framework [diagram]: Program Performance Measures; Scholar Data; Personnel Data; Project Data; Student Data Collection; Service Obligation; Annual Reports

  11. Measure 1: The percentage of Special Education Personnel Preparation projects that incorporate evidence-based practices in the curriculum
  • Annual review of syllabi by an expert panel for all new grantees:
  • Leadership
  • High-Incidence (beginning in FY 2007, submitted at end of Year 1)
  • Low-Incidence
  • Early Childhood
  • Minority
  • Related Services

  12. Measure 2: The percentage of scholars completing Special Education Personnel Preparation funded training programs who are knowledgeable and skilled in evidence-based practices
  • Praxis II Special Education (for teachers)
  • Other performance measures, for example for related services personnel, or in states where measures other than Praxis II are used
  • Data collected through Annual Performance Reports this year (FY 2004 grantees only); future collection through the OSEP Student Data Report for all grantees

  13. Measure 3: The percentage of Special Education Personnel Preparation funded scholars who exit training programs prior to completion due to poor academic performance
  • Data collected through the OSEP Student Data Report
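As an illustration of how a scholar-level measure like this might be computed, here is a hypothetical sketch. The record fields, the exit-reason code, and the choice of all funded scholars as the denominator are assumptions for illustration only; they are not taken from the actual OSEP Student Data Report specification.

```python
# Hypothetical computation of Measure 3 from scholar-level records.
# Field names ("status", "exit_reason") and the denominator (all funded
# scholars) are invented for illustration; the real OSEP data elements
# and business rules may differ.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ScholarRecord:
    scholar_id: str
    status: str                        # e.g. "completed", "exited", "enrolled"
    exit_reason: Optional[str] = None  # e.g. "poor_academic_performance"

def measure_3(records: list[ScholarRecord]) -> float:
    """Percent of funded scholars who exited prior to completion
    due to poor academic performance."""
    if not records:
        return 0.0
    exited_poor = sum(
        1 for r in records
        if r.status == "exited" and r.exit_reason == "poor_academic_performance"
    )
    return 100.0 * exited_poor / len(records)

records = [
    ScholarRecord("S1", "completed"),
    ScholarRecord("S2", "exited", "poor_academic_performance"),
    ScholarRecord("S3", "enrolled"),
    ScholarRecord("S4", "completed"),
]
print(f"Measure 3: {measure_3(records):.1f}%")  # 25.0%
```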

  14. [Chart] FY 2001 – FY 2005, Percent

  15. Measure 4: The percentage of low-incidence positions that are filled by personnel who are fully qualified under IDEA
  Proposed plan:
  • Data submitted by SEAs and collected through the Section 618 data collection for the Annual Report to Congress
  • Data collection to begin in the 2007-2008 school year

  16. Measure 5: The percentage of Special Education Personnel Preparation funded degree/certification program recipients who are working in the area(s) in which they were trained upon program completion
  • Data collected through the OSEP Student Data Report

  17. [Chart] FY 2002 – FY 2005, Percent

  18. Measure 6: The percentage of Special Education Personnel Preparation funded degree/certification program recipients who are working in the area(s) in which they were trained upon program completion and who are fully qualified under IDEA
  • Data collected through the Annual Performance Report this year from FY 2004 grantees
  • Future data collected through the OSEP Student Data Report

  19. Measure 7 (Employed for three or more years): The percentage of degree/certification recipients who maintain employment for 3 or more years in the areas for which they were trained and who are fully qualified under IDEA
  • This year, data collected through a sample of grantees (9 institutions)
  • Future data collected from the National Center on Student Obligation database

  20. Measure 8: The percentage of funds expended on scholars who drop out of programs because of:
  • Poor academic performance
  • Scholarship support being terminated when the Federal grant to their institution ends
  • Data collected through the OSEP Student Data Report
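Unlike the headcount-based measures, Measure 8 is funds-weighted. The sketch below illustrates that distinction with invented records; the field names, dropout-reason codes, and the assumption that the denominator is total funds expended on all scholars are hypothetical, not drawn from OSEP's actual data collection.

```python
# Hypothetical sketch for Measure 8: share of expended funds that went to
# scholars who dropped out for the two reasons listed on the slide.
# Field names, reason codes, and the denominator choice are assumptions.

QUALIFYING_REASONS = {
    "poor_academic_performance",
    "scholarship_terminated_grant_ended",
}

def measure_8(expenditures: list[dict]) -> float:
    """expenditures: one dict per scholar with 'funds_expended' (dollars)
    and an optional 'dropout_reason'."""
    total = sum(e["funds_expended"] for e in expenditures)
    if total == 0:
        return 0.0
    dropped = sum(
        e["funds_expended"] for e in expenditures
        if e.get("dropout_reason") in QUALIFYING_REASONS
    )
    return 100.0 * dropped / total

data = [
    {"funds_expended": 20_000},  # completed the program
    {"funds_expended": 12_000, "dropout_reason": "poor_academic_performance"},
    {"funds_expended": 8_000,  "dropout_reason": "scholarship_terminated_grant_ended"},
]
print(f"Measure 8: {measure_8(data):.1f}%")  # 50.0%
```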

  21. SERVICE OBLIGATION

  22. Service Obligation: Which Rules Apply and When? There are three relevant sets of rules in play with respect to the service obligation:
  1. The December 9, 1999 regulations will apply to individuals supported by grants awarded in FY 2004 or earlier.

  23. Which Rules Apply and When? 2. The "Additional Requirements" section of the Personnel Preparation To Improve Services and Results for Children With Disabilities--Combined Priority and the Leadership notices for Personnel Preparation, published in the Federal Register on March 25, 2005, will apply to those awards made in FY 2005.

  24. Which Rules Apply and When? 3. The final regulations implementing Section 662(h) of the Individuals with Disabilities Education Improvement Act (IDEA) of 2004 (see 71 FR pp. 32395-32400) became effective July 5, 2006, and apply to those awards made in FY 2006 and later.

  25. How Does Continuation Funding Affect the Rules? Grantees may receive continuation funding in later years; however, the rules that were in effect in the year a grant was initially awarded apply to all future years of that grant.

  26. How Will the Department Monitor the Service Obligation? • A contract will be awarded to monitor the service obligation and provide a website and web-based data collection system. • The website will provide information. • A helpline will also be available.

  27. Frequently Asked Questions Please submit questions you would like answered. Providing your contact information would be helpful so that we can ask you clarifying questions, if necessary.

  28. THANK YOU!
