Presentation Transcript


  1. Fall 2005- PERFORMANCE BASED MONITORING ANALYSIS SYSTEM

  2. PBMAS Topics • What is PBM? • What is Evaluated, and When? • What Standards are Used? • What Interventions are Required? • Implications for Cooper ISD…

  3. PBMAS 2005 What is PBM?

  4. From DEC to PBM • DEC: on-site visit; every 5 years, or according to “risk”; subjective; “here & now” • PBM: desktop monitoring; every year, on site after non-compliance; data driven; prior data (from last year at best)

  5. DEC & PBM Common Ground • District level evaluations • Evaluations based on special programs and student populations • Federal mandates implemented by state agencies

  6. PBMAS 2005 What is evaluated… and When?

  7. 4 Program Areas • ESL / Bilingual (ELL) • CATE (CTE) • SPED • NCLB

  8. NCLB Areas • Economically Disadvantaged • Migrant • Limited English Proficient • Highly Qualified Teachers • Safe & Drug-Free Schools

  9. Data Collected for Each Program Area • TESTING: TAKS (Feb, April of ‘05); SDAA II (Feb, April of ’05); RPTE (March of ’05) • PEIMS: October Snapshot (2004); June Submission (2004)

  10. Data Disaggregation • Currently there are 115 indicators assessed • Each area is broken down into groups by ethnicity, gender, combinations of programs, etc. • Indicators are added each year (new for ’05: CATE RHSP %)
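
To make the disaggregation concrete, here is a minimal sketch, assuming student-level records in a pandas DataFrame; the column names and the single pass/fail indicator are hypothetical stand-ins, not TEA’s actual PBMAS layout or indicator definitions.

    import pandas as pd

    # Hypothetical student records; real PBMAS indicators come from TAKS/PEIMS data.
    students = pd.DataFrame({
        "ethnicity": ["Hispanic", "White", "African American", "White", "Hispanic"],
        "program":   ["SPED", "CTE", "SPED", "ELL", "CTE"],
        "taks_math_passed": [1, 1, 0, 1, 0],
    })

    # Break one indicator down by group, as the slide describes: the same
    # pass rate is computed separately for each program/ethnicity combination.
    rates = (students
             .groupby(["program", "ethnicity"])["taks_math_passed"]
             .mean()
             .mul(100)
             .round(1))
    print(rates)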

  11. PBMAS 2005 What Standards are Used?

  12. A Combination of Standards • AEIS ratings • AYP goals • State averages • State targets

  13. Ratings per indicator in each area • 0 = MET STANDARD • 1 = Did not meet standard • 2 = Further from standard • 3 = Farthest from standard

  14. PBMAS 2005 What Interventions are Required?

  15. A stage is assigned based on the number of 3’s, combined with the numerical average of all indicator ratings, within each program area (see the sketch below)
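
A minimal sketch of this staging logic in Python, assuming the 0–3 scale from slide 13. The cutoff values are invented placeholders (chosen so the example reproduces the Stage 1B projection on slide 17); TEA’s actual thresholds are not given in this presentation.

    # Assign a PBMAS-style intervention stage for one program area.
    # The cutoffs below are hypothetical, for illustration only.
    def assign_stage(ratings):
        """ratings: list of 0-3 indicator ratings for one program area."""
        threes = ratings.count(3)
        average = sum(ratings) / len(ratings)
        if threes >= 5:
            return "Stage 3"
        if threes >= 3:
            return "Stage 2"
        if average >= 0.75:
            return "Stage 1B"
        if average > 0:
            return "Stage 1A"
        return "Met standard"

    # Slide 24's Special Education ratings: nine 0's, five 1's, two 2's, two 3's.
    sped = [0] * 9 + [1] * 5 + [2] * 2 + [3] * 2  # average ~0.83, two 3's
    print(assign_stage(sped))  # "Stage 1B" under these made-up cutoffs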

  16. Stage 1A (our rating for 2004) • Assemble Core Analysis Team • Perform Focused Data Analysis • Create & Implement Continuous Improvement Plan (CIP) • Retain local documentation

  17. Stage 1B (projected 2005 rating) • ADDITIONAL mandates requiring data analysis and ongoing documentation in 10 areas • Ex: Extended School Year planning, analysis of outcomes, documentation • Submit all documents • Assignment of TEA representative

  18. Stage 2 • ADDITIONALLY- hold a series of Public Meetings for information and input to CIP

  19. Stage 3 • ADDITIONALLY- complete compliance reports • Probable OCR Visit • (Failure to reach compliance leads to possible loss of funding, etc.)

  20. PBMAS 2005 Implications for Cooper ISD

  21. Bilingual / ESL area • Not enough students to count, even when multiple years are added together • 6 additional students could mean evaluation of some areas in later years • If evaluated, some current percentages would not meet standards

  22. Career / Tech Ed area • MET STANDARD on ALL INDICATORS

  23. No Child Left Behind • Met all standards AND • Met all AMAOs

  24. Special Education • “0” = 9 indicators • “1” = 5 indicators • “2” = 2 indicators • “3” = 2 indicators

  25. SpEd “0’s” for CISD • TAKS Math • TAKS Reading/LA • TAKS S.Studies • TAKS Writing • EXIT TAKS Math • EXIT TAKS Rdg/LA • Exemptions from all testing • Dropout Rate • Discretionary Expulsions

  26. SpEd “1’s” for CISD • SDAA Gap Closure Math * • Less Restrictive Environment (age 3-11) • Hispanic Representation % • Discretionary DAEP Placements * • Discretionary ISS Placements (* = improvement from 2004; % = decline from 2004)

  27. SpEd “2’s” for CISD • SDAA Gap Closure Reading • Less Restrictive Environment (age 12-21) % (* = improvement from 2004; % = decline from 2004)

  28. SpEd “3’s” for CISD • SpEd Identification (percent of student body) % • Identification of African American SpEd students (disproportionate percent) (* = improvement from 2004; % = decline from 2004)
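
For illustration, one common way to quantify the “disproportionate percent” flagged above is a risk ratio: the group’s SpEd identification rate divided by the rate for all other students. This formula and the numbers below are assumptions for the sketch, not TEA’s published PBMAS calculation.

    # Risk ratio for over-identification: > 1.0 means the group is identified
    # for SpEd at a higher rate than its peers. All numbers are invented.
    def risk_ratio(group_sped, group_enrolled, total_sped, total_enrolled):
        group_rate = group_sped / group_enrolled
        others_rate = (total_sped - group_sped) / (total_enrolled - group_enrolled)
        return group_rate / others_rate

    # Hypothetical district: 300 students, 40 in SpEd;
    # 45 African American students, 12 of them in SpEd.
    print(round(risk_ratio(12, 45, 40, 300), 2))  # ~2.43x the rate of peers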

  29. PBMAS 2005 Continuous Improvement Plans

  30. CIP: SpEd Identification Data analysis showed that although the percent identified on PEIMS submissions had already risen, new referrals for 2004-2005 were less than half the number from previous years

  31. CIP: SpEd Identification • Revamped Study Team at Elementary level with three phases of interventions before SpEd referral • Created Study Team process at Jr/Sr High campus to screen referrals

  32. CIP: African American Representation Data analysis showed that the majority of SpEd referrals reflect low SES rather than ethnicity alone

  33. CIP: African American Representation • Ruby Payne training to include intervention strategies for low SES students • Summer school & after school tutoring opportunities for At Risk students • Early literacy parent workshop scheduled for elementary campus

  34. CIP: SDAA Reading Gap Data analysis showed gap increases as students promote through grade levels

  35. CIP: SDAA Reading Gap • Training for teachers, and monitoring of lesson plans, for inclusion of all TEKS in resource classes • Scheduling procedure for secondary resource classes to even out class size • Transition team created to align instruction for 5th graders promoting to 6th grade

  36. CIP: DAEP Placement * (data analysis showed improved percentages already submitted in summer PEIMS) • Continue efforts to track student discipline with a focus on SpEd students (* Item dropped from official CIP; goal is to maintain progress)

  37. CIP: Less Restrictive Environment % (Not addressed in previous data analysis because it was not an issue at that time; will be addressed in an upcoming revision of the CIP) • CIP will include ARD committee training

  38. PBM Core Team • Anne Mills • Toby Howard • Michael Ramsay • Doug Wicks • Denicia Hohenberger • Tonya Potts • Janie McMackin • Paula Rackley • Debbie Grider • Stephen Bonner • Jeanette Burnett • Reg Ed Teacher:

  39. Questions & Answers
