Indicator #14 SPP/APR Region VIII Employment Conference October 17, 2006 Dr. Greg Cooch BHSU

Presentation Transcript


  1. Indicator #14 SPP/APR Region VIII Employment Conference October 17, 2006 Dr. Greg Cooch BHSU

  2. Three Distinct Eras of Special Education • Era of Charisma: the period before 1975 and PL 94-142, when the only reason students received services was that parents or educational leaders insisted they be educated in public schools • Era of Equity: the period from about 1975 to 2000, when the special education focus was on ensuring equitable student services and on the "process" of special education, e.g., dotting the i's and crossing the t's

  3. Special Education Eras--continued • Era of Accountability • From approximately 2000 to the present (and will probably be with us for some time) • The focus is on OUTCOMES • The issue is acceptable programming procedures and the effectiveness of special education (Kukic, Teleconference 4/14/04--MPRRC)

  4. Impairment: from Handicap to Disability • Handicap--Environmental barriers: stairs, curbs, attitudes; EHA (1975); PL 94-142 & PL 99-457; "ACCESS FIRST"; "LETTER OF THE LAW"; Goal: access to schools to meet the minimum requirements of the Education for All Handicapped Children Act • Disability--Physiological: LD, SL, VI, etc.; IDEA; PL 101-476, 105-17, 108-446; "PEOPLE FIRST"; access to the general curriculum and OUTCOMES; "SPIRIT OF THE LAW"; Goal: full and equal educational opportunity in the general curriculum and meaningful outcomes/accountability • The field is moving from the Handicap view toward the Disability view

  5. Special Education has entered an era of increasing accountability • Mounting pressure to document the impact of special education instruction on academic achievement

  6. State Performance Plans (SPP) & Annual Performance Reports (APR) • States are required to describe in their SPP a 6-year plan to address the 5 monitoring priorities and 20 indicators for students with disabilities • The APR will be submitted annually to document progress toward addressing those priorities and indicators

  7. Targets---OUTCOMES • Each state is required to develop measurable and rigorous targets for each of the priorities and indicators • The purpose of collecting data on measurable and rigorous targets is: • Accountability for the Sped Program • To help guide systemic improvement

  8. The State Performance Plan (SPP) • The SPP (FFY 2005-2010) was submitted Dec. 2, 2005 • The SPP is to be reviewed by the State at least once every six years • The SD SPP was approved by the Secretary of Education

  9. Annual Performance Report • The APR is to be submitted annually • The first APR will be due February 7, 2007 • The State shall report annually to the public on the performance of each LEA (Local Education Agency) in the State on the targets in the SPP

  10. Statutory Requirements: SPP (State Performance Plan) • States shall use the targets in the SPP to analyze the performance of each Local Education Agency (School District) in the State • The State shall report annually to the public on the performance of each LEA Program in the State on the targets in the SPP

  11. SPP Content • Overview of the System or Process • Baseline Data • Discussion of Baseline Data • Stakeholder Input • Measurable and Rigorous Targets • Improvement Activities/Timelines/Resources • The SPP asks how the State • Obtained broad stakeholder input, and • Will disseminate SPP to the public

  12. APR Content • In APRs, States will provide the following: • Actual performance against each target • Discussion of improvement activities completed and explanation of progress or slippage • Any revisions to proposed targets, improvement activities, timelines or resources---with justification

  13. Bottom Line • Improvement activities are designed to meet targets • Targets are measurable and reflect improvement • Baseline data is present, clear, and measurable • Data is valid • Required information is included

  14. Monitoring Priority Areas: Part B • 20 USC 1416(a)(3) • FAPE in the LRE • Disproportionality • Effective General Supervision • Child Find • Effective Transition • General Supervision

  15. Monitoring Priorities (Indicators) • FAPE in the LRE • 8 Indicators • Disproportionality • 2 Indicators • Effective General Supervision • 10 Indicators

  16. The 5 monitoring Priorities with respective Indicators: • Monitoring Priority 1: FAPE in the LRE • Indicators 1, 2, 3, 4, 5, 6, 7, 8 (Indicators 7 & 8 are New) • Monitoring Priority 2: Disproportionality • Indicators 9 & 10 (both New) • Monitoring Priority 3: Effective General Supervision Part B--Child Find • Indicator 11 (New) • Monitoring Priority 4: Effective General Supervision Part B--Effective Transition • Indicators 12, 13, 14 (13 & 14 are New) • Monitoring Priority 5: Effective General Supervision Part B--General Supervision • Indicators 15, 16, 17, 18, 19, 20 (18 is New) • HANDOUT

  17. Most of the indicators specified in the SPP relate to student performance while students are still in the public school system • Indicator #14 is different because it requires states to document post-school outcome experiences (i.e., competitive employment and post-secondary education)

  18. Indicator #14 reads as follows: Percent of youth who had IEPs, are no longer in secondary school, and who have been competitively employed, enrolled in some type of postsecondary school, or both, within one year of leaving high school
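Behind this wording is a simple percentage: the number of responding leavers who report competitive employment, postsecondary enrollment, or both, divided by the number of respondents. A minimal sketch in Python, assuming hypothetical survey records whose field names are illustrative and not part of the SPP itself:

```python
# Minimal sketch of the Indicator #14 percentage, computed from
# hypothetical survey records; field names are illustrative only.

def indicator_14_percent(respondents):
    """Percent of responding leavers who were competitively employed,
    enrolled in postsecondary education, or both, within one year of exit."""
    if not respondents:
        return 0.0
    engaged = [
        r for r in respondents
        if r["competitively_employed"] or r["enrolled_postsecondary"]
    ]
    return 100.0 * len(engaged) / len(respondents)

# Example: three survey responses from former students with IEPs
responses = [
    {"competitively_employed": True,  "enrolled_postsecondary": False},
    {"competitively_employed": False, "enrolled_postsecondary": True},
    {"competitively_employed": False, "enrolled_postsecondary": False},
]
print(indicator_14_percent(responses))  # 2 of 3 leavers engaged, about 66.7
```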

  19. Indicator #14 suggests that employment and postsecondary experiences reflect— The ultimate purpose of our K-12 system—to prepare students to become contributing citizens in our society

  20. Indicator #14 will present some unique challenges to States: • 1. Students to be included in this data collection system are no longer students, and • 2. The designated outcomes cannot be gathered through the completion of a test within the confines of the school building

  21. In order to gather accurate information for Indicator #14, it will be necessary to query young people who were on IEPs after their exit from high school. Only by surveying them with questions pertinent to Indicator #14 will states be able to collect the necessary data on educational and employment experiences

  22. States need to determine whether they will gather sample data from some students and, if sampling, which students to include. Some states will choose not to sample and will instead gather data from all school leavers. SD has decided to gather data from all school leavers, since the number of leavers in the state is relatively small. Larger states will use representative sampling in collecting data; the sketch below illustrates both approaches.
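A short sketch of the census-versus-sampling choice described above; simple random sampling stands in here for whatever representative design a larger state would actually use, and every name is hypothetical:

```python
import random

def select_for_survey(exiters, sample_size=None, seed=2006):
    """Return the exiters to contact for the Indicator #14 survey.

    sample_size=None mirrors the SD approach (survey every leaver);
    a number mirrors a larger state drawing a sample instead.
    """
    if sample_size is None or sample_size >= len(exiters):
        return list(exiters)                 # census: all school leavers
    rng = random.Random(seed)                # fixed seed keeps the draw repeatable
    return rng.sample(exiters, sample_size)  # simple random sample (illustrative)
```

A real sampling plan would typically stratify by district size, disability category, and exit status so the sample stays representative; that detail is omitted here.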

  23. Statutory Requirements • Shall be submitted for approval by the Secretary • Must be reviewed every 6 years • State must collect valid and reliable information as needed to report every year • State shall report annually to the Secretary on the performance of the State on the SPP • Source: Ruth Ryder, Director, Division of Monitoring and State Improvement Planning, OSEP, March 2006, Portland

  24. IDEA Purpose (d)(1)(A): to ensure that all children with disabilities have available to them a FAPE that emphasizes special education and related services designed to meet their unique needs and prepare them for further education, employment and independent living. Source: Jane Falls, Project Coordinator, National Post School Outcomes Center, University of Oregon, March 2006, Portland

  25. Data Collection Procedures The Who, What, How, When, and by Whom

  26. The WHO: • Who are data collected on? • All graduates/completers • Aged-out of school (age 21) • Early Leavers/Dropouts

  27. The WHAT: • What data are collected? • In-school • Contact information • Demographic characteristics • Leaving status • Is extant data available? • Post-school • 1 year out of school • Between April and September
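Taken together, the in-school and post-school items above amount to one small record per exiter. A minimal sketch of such a record, with illustrative rather than official field names:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ExiterRecord:
    """One former student with an IEP, tracked for Indicator #14.
    Fields mirror the slide's categories; the names are illustrative."""
    # In-school data, captured during the last year of attendance
    student_id: str
    contact_phone: Optional[str] = None
    contact_email: Optional[str] = None
    demographics: dict = field(default_factory=dict)
    leaving_status: str = "graduate"  # graduate/completer, aged out, early leaver/dropout
    # Post-school data, collected one year out (April-September window)
    competitively_employed: Optional[bool] = None
    enrolled_postsecondary: Optional[bool] = None
```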

  28. The HOW: • How are data collected? • Extant data---in SD's case, because we don't have access to existing data, a secured website was developed • Survey methodology • Phone survey • Mail survey • Web-based • Combination

  29. The WHEN: • When are the data collected? • In-school • During the last year of attendance • Method for capturing early leavers • Post-school • 1 year out of school • Between April and September

  30. By WHOM? • LEA staff • Former teachers • Support staff • SEA staff • Contracted party • SD has opted for the latter

  31. Data Use and Requirements for Federal Reporting

  32. Indicator #14 Timelines • 12/02/05 Plan to collect data • 02/01/07 Status report of exiters (Appendix A)---referred to as the "Anchor Point" • 02/01/08 Progress report of exiters (Appendix B), including baseline, targets and improvement activities • Source: OSEP/Ruth Ryder, Portland

  33. Statutory Requirements • State shall use targets in the SPP to analyze the performance of each LEA in the state • State shall report annually to the public on the performance of each LEA in the state on the targets in the SPP

  34. Reporting • Annually on the performance of the State • To the Secretary • To the Public (N>10) • Annually on the performance of the LEAs in the state • Each LEA, each year, each indicator
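The "(N>10)" note refers to suppressing publicly reported results when too few students are involved. A hedged sketch of that kind of minimum-cell-size rule; the threshold and the suppression symbol are assumptions for illustration, not the State's actual policy:

```python
def public_report_value(percent, n_respondents, minimum_n=10):
    """Format an LEA result for public reporting, suppressing it
    when the group is too small to protect student privacy."""
    if n_respondents <= minimum_n:   # publish only when N > 10
        return "*"                   # suppressed value (illustrative symbol)
    return f"{percent:.1f}%"

print(public_report_value(66.7, 25))  # 66.7%
print(public_report_value(50.0, 4))   # *
```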

  35. Reporting Questions • Currently, OSEP doesn't have a template for the design of the report • Questions that have been raised by states include: • Format and mechanism for reporting? • Will you compare districts to overall state performance? • Will you compare districts of similar size and location? • Will you include narrative for qualitative info? • Tables, charts, graphs?

  36. Stakeholders for the GPRA (Gov't Performance and Results Act) have established the following Focus Areas for Transition: • Promote programs that achieve a balance between academic achievement and participation in employment (this is the "heart" of Indicator #14) • Develop a broad range of performance measures to assess student outcomes (still struggling with this--how do we monitor 3-5 years out?) • Increase collaboration among stakeholder agencies for long-term success • Promote early student and family involvement with Transition planning, emphasizing self-determination • Support and disseminate model programs of evidence-based success in meeting the needs of transition-aged students/families • Source: Marlene Burroughs, Assoc. Director, Research to Practice, OSEP

  37. Concern area: hard-to-find youth • How to increase the response rate? • States that have been tracking graduates/leavers say this is the most difficult group to find • Other groups that are difficult to locate include: • ED • Homeless • DOC • Certain cultural groups • Children in larger urban areas • Foster care • Several others as well--migrant, home-schooled, etc.

  38. Strategies to find exiters/leavers • Multiple methods of contact--phone, letter, email • Internet sites--some states have set these up • Know where their friends are--phone numbers, email addresses (one state reported this was the best way to find this population) • Bottom line: most states report they have had a hard time tracking this population

  39. Additional strategies to find exiters: • Parent contact (North Dakota) • Pre-contact letter prior to the survey (Idaho) • Exiters from schools that have a good relationship with them seem to be easier to find, e.g., school personnel know them • Bottom line: this will probably be THE significant problem • Other issue: districts have an incentive not to find dropouts, since the data will be publicly reported

  40. References • MPRRC--John Copenhaver • OSEP--Marlene Simon-Burroughs, Assoc. Director • OSEP--Ruth Ryder, Director • National Post School Outcomes Center--Deanne Unruh, Project Assoc. • National Post School Outcomes Center--Jane Falls, Coordinator • National Post School Outcomes Center--Mike Bullis, Director • Center for Change in Transition Services, Seattle University--Cinda Johnson, Director • SD Dept of Education--Special Education Office
