Illinois Statewide Implementation of the Problem Solving/RtI Initiative
NASP 2010 Annual Convention, Hyatt Regency, Chicago, IL, March 4, 2010

Presentation Transcript


  1. Illinois Statewide Implementation of the Problem Solving/RtI Initiative. NASP 2010 Annual Convention, Hyatt Regency, Chicago, IL, March 4, 2010. David Bell, St. Xavier University & I-ASPIRE Chicago, bell@sxu.edu; Gary L. Cates, Illinois State University & I-ASPIRE Central, glcates@ilstu.edu; Kathryn Cox, Illinois State Board of Education, kcox@isbe.net; Ben Ditkowsky, Lincolnwood SD 74 & I-ASPIRE North, ben@measuredeffects.com; Sara Golomb, Doctoral Student, Loyola University Chicago, sgolomb@luc.edu; Mark E. Swerdlik, Illinois State University & I-ASPIRE Central, meswerd@ilstu.edu

  2. Session Objectives. In today's presentation, we will:
  • Provide an overview of Illinois ASPIRE and the major project evaluation components
  • Discuss project evaluation results from three data sources: the Self-Assessment of Problem Solving Implementation (SAPSI), student outcome data, and the IHE Checklist (a review of selected educator preparation programs in Illinois)
  • Share some of the challenges associated with data collection for project evaluation

  3. Illinois ASPIRE: Alliance for School-based Problem-solving & Intervention Resources in Education
  • 5-year State Personnel Development Grant from OSEP (now in Year 5)
  • Based on 10+ years of state experience with problem solving in a 3-tier model
  • Primary Goal: Establish and implement a coordinated, regionalized system of personnel development that will increase the capacity of LEAs to provide early intervening services (with an emphasis on K-3 reading), aligned with the general education curriculum, to at-risk students and students with disabilities, as measured by improved student progress and performance.

  4. I-ASPIRE Objectives
  • Deliver research-based professional development and technical assistance.
  • Increase the participation of parents in decision-making across district sites.
  • Incorporate professional development content into IHE general and special education preservice curricula.
  • Evaluate the effectiveness of project activities.

  5. I-ASPIRE: Regional System for T.A. & Professional Development
  • 4 regional Illinois ASPIRE Centers:
    • Illinois ASPIRE - Chicago: Chicago Public Schools
    • Illinois ASPIRE - North: Northern Suburban Special Ed.
    • Illinois ASPIRE - Central: Peoria ROE #48
    • Illinois ASPIRE - South: Southern Illinois University Edwardsville
  • Collaboratives of LEAs, IHEs, regional providers, and parent entities
  • Responsible for:
    • Training for districts and parents in the region
    • General technical assistance (T.A.)
    • On-site T.A. to school demonstration sites

  6. Illinois T.A. Project Evaluation
  • Coordinated through Loyola University Chicago, Center for School Evaluation, Intervention & Training (CSEIT); http://www.luc.edu/cseit/i-aspire.shtml
  • Evaluation Framework:
    • If people are trained, do they implement?
    • If people implement, do they do so with fidelity?
    • If people implement with fidelity, do they sustain the practice(s) over time?
    • If people sustain the practice(s), what is the impact on student outcomes (school, group, individual)?

  7. I-ASPIRE Evaluation: Key Data Sources
  • Self-Assessment of Problem Solving Implementation (SAPSI): Assesses the degree of implementation of the problem-solving process at the building level, as self-reported by school sites
  • Fidelity of Implementation Checklist: Designed to assess the degree to which problem-solving & RtI processes are implemented as intended; involves a review of products by an external reviewer
  • Student Outcome Data: Involves analysis of universal screening, progress monitoring, and state assessment (ISAT) results
  • Parent Survey: Assesses the participation (more than satisfaction) in the problem-solving process of parents and guardians whose children are receiving Tier 3 interventions
  • IHE Checklist: Designed to assess the amount of RtI content incorporated into IHE general and special education pre-service and graduate curricula

  8. SAPSI and Fidelity of Implementation Checklist. Gary L. Cates, Illinois State University, I-ASPIRE Central

  9. Self-Assessment of Problem-Solving Implementation (SAPSI)
  • School/administration-focused problem-solving survey
  • 25 questions
  • Completed twice per year
  • Action planning document
  • Developed over time and tweaked when necessary

  10. SAPSI

  11. SAPSI Outcomes

  12. SAPSI Outcomes

  13. SAPSI Outcomes

  14. SAPSI Outcomes

  15. SAPSI Outcomes

  16. SAPSI Outcomes

  17. SAPSI Outcomes

  18. Fidelity Checklist
  • External check of fidelity of implementation
  • Completed once per year (spring)
  • 5 "cases" from 5 randomly chosen schools
  • Inter-rater reliability of evaluators >80% (see the agreement sketch following this slide)
  • Sources: SIP, IPF, data files, CBM, training logs
  • Dichotomous scoring
  • Comments
  • A few additional scoring guidelines for specific items
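The inter-rater reliability figure above reflects agreement between evaluators scoring the same dichotomous checklist items. As a rough illustration of how such percent agreement (and a chance-corrected version, Cohen's kappa) can be computed, here is a minimal Python sketch; the item scores and the ten-item checklist length are hypothetical, not project data:

```python
# Minimal sketch: percent agreement and Cohen's kappa for two raters scoring
# the same fidelity checklist dichotomously (1 = evident, 0 = not evident).
# The scores below are hypothetical, for illustration only.

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which the two raters gave the same score."""
    agree = sum(a == b for a, b in zip(rater_a, rater_b))
    return agree / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement for dichotomous (0/1) ratings."""
    n = len(rater_a)
    p_obs = percent_agreement(rater_a, rater_b)
    p_a1 = sum(rater_a) / n          # proportion of 1s from rater A
    p_b1 = sum(rater_b) / n          # proportion of 1s from rater B
    p_chance = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (p_obs - p_chance) / (1 - p_chance)

rater_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]   # hypothetical external reviewer
rater_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]   # hypothetical second reviewer

print(f"Percent agreement: {percent_agreement(rater_a, rater_b):.0%}")  # 90%
print(f"Cohen's kappa:     {cohens_kappa(rater_a, rater_b):.2f}")
```

Percent agreement alone is what a ">80%" criterion usually refers to; kappa is shown only because it corrects for agreement expected by chance.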

  19. Fidelity Checklist

  20. Fidelity Outcomes

  21. Fidelity Outcomes

  22. Fidelity Outcomes

  23. Fidelity Outcomes

  24. Fidelity Outcomes

  25. Fidelity Outcomes

  26. Fidelity Outcomes

  27. Fidelity Outcomes

  28. Fidelity Outcomes

  29. Fidelity Outcomes

  30. Fidelity Outcomes

  31. Fidelity Outcomes

  32. Fidelity Outcomes

  33. Illinois ASPIRE Northern Region: Targeted Program Evaluation. Ben Ditkowsky, Lincolnwood School District 74, I-ASPIRE North

  34. Evaluation Question
  • Assumption(s): Successful implementation of RtI will increase average overall achievement and, in particular, rate of improvement (ROI) for students who receive Tier 2 and Tier 3 interventions (an ROI sketch follows this slide)
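In an RtI model, ROI (rate of improvement) is usually estimated as the slope of a student's progress-monitoring scores over time, e.g., words read correct gained per week. A minimal sketch of one common estimate, an ordinary least-squares slope over weekly CBM probes, follows; the probe scores are hypothetical:

```python
# Minimal sketch: estimating a student's rate of improvement (ROI) as the
# least-squares slope of weekly CBM progress-monitoring scores.
# The probe scores below are hypothetical, for illustration only.

def roi_slope(weeks, scores):
    """Ordinary least-squares slope: words read correct gained per week."""
    n = len(weeks)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

weeks = list(range(1, 11))                         # ten weekly probes
scores = [22, 24, 23, 27, 29, 28, 32, 33, 35, 36]  # words read correct per minute

print(f"ROI: {roi_slope(weeks, scores):.2f} words read correct per week")
```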

  35. Facts
  • Increases in achievement come from changes in curriculum and instruction, fidelity of implementation, increased behavior support, etc.

  36. Why Use Local Assessments, Such As CBM?
  • State-mandated tests assess outcomes
  • Local assessments allow us to:
    • Measure students earlier than 3rd grade
    • Monitor progress more frequently than once per year
    • Rely on multiple assessment tools for our information
    • Develop an integrated assessment system with benchmarks for performance, linked to a common outcome

  37. Question #1: Do scores on CBM matter? (Source data: AIMSweb)

  38. How many words did they read in one minute? Dorothy read 94, Juan read 65, Mary read 44, Sam read 22.

  39. Dorothy read 94, Juan read 65, Mary read 44, Sam read 22.

  40. [Chart of ISAT performance levels: Exceeds Standards, Meets Standards, Below Standards, Academic Warning]

  41. In the fall, Dorothy read 94 correct words in a minute; she obtained a score of 169 on the state test in the spring.

  42. Curriculum-Based Measurement is a measure of general reading competence
  • Validity coefficients for R-CBM with the Comprehension subtest of the SAT were .91, compared with Question Answering (.82), Recall (.70), and Cloze (.72) (Fuchs, Fuchs & Maxwell, 1988)
  • The validity coefficient for Text Fluency of Folk Tales with the Iowa Test of Basic Skills Comprehension was .83 (Jenkins, Fuchs, Espin, van den Broek & Deno, 2000)
  • Fluency is causally related to reading comprehension (National Reading Panel, NICHD, 2000)
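The validity coefficients cited above are correlations between CBM scores and a criterion comprehension measure. For readers who want the computation made concrete, here is a minimal Pearson-r sketch; the paired scores are hypothetical and are not data from the cited studies:

```python
# Minimal sketch: Pearson correlation (validity coefficient) between
# R-CBM scores and a criterion comprehension measure.
# The paired scores below are hypothetical, for illustration only.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

rcbm = [22, 44, 65, 80, 94, 110, 57, 73]                   # fall words read correct/min
comprehension = [140, 155, 168, 175, 181, 190, 160, 172]   # criterion test scores

print(f"Validity coefficient (r): {pearson_r(rcbm, comprehension):.2f}")
```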

  43. Is fall Curriculum-Based Measurement related to state testing? [Chart: 62% / 38%]

  44. Big Idea #1: Scores on CBM are related to results of high-stakes testing.

  45. [Chart: Correct = 90%, Correct = 81%, Correct = 71%, Correct = 71%, Correct = 83%, Correct = 88%]

  46. CBM Is a Reliable Predictor of ISAT. Note: Data from 8 small- to moderate-sized school districts in the Northern Region of Illinois, 2008.
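A "reliable predictor" claim of this kind is typically supported by a classification analysis: choose a fall R-CBM cut score and check how often it correctly classifies students who later meet, or do not meet, standards on ISAT. Here is a minimal sketch with hypothetical students and a hypothetical cut score of 77 words read correct per minute:

```python
# Minimal sketch: classification accuracy of a fall R-CBM cut score for
# predicting whether a student later meets standards on the state test.
# Students, scores, and the cut score are hypothetical, for illustration only.

CUT_SCORE = 77  # hypothetical fall R-CBM cut score (words read correct/min)

# (fall R-CBM score, met ISAT standards in spring?)
students = [
    (94, True), (65, False), (44, False), (22, False),
    (110, True), (80, True), (73, False), (57, False),
    (88, True), (69, True),
]

correct = 0
for wrc, met_standards in students:
    predicted_to_meet = wrc >= CUT_SCORE
    if predicted_to_meet == met_standards:
        correct += 1

accuracy = correct / len(students)
print(f"Correctly classified: {accuracy:.0%}")
```

A full diagnostic-accuracy analysis would also report sensitivity and specificity, but the overall percent correct is the figure the "Correct = …%" chart values appear to summarize.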

  47. What Is the Probability of Meeting Standards on the State Test?
  • Note: Empirical confidence intervals constructed through bootstrapping 100 samples without replacement
  • For a probability of .8, a student would have to read 77 WRC (between 76 and 80 WRC)
  • For a probability of .5, a student would have to read 53 WRC (between 51 and 54 WRC)
  • [Chart axes: Probability of meeting standards on ISAT vs. Fall R-CBM (Grade 3)]
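One way to produce estimates like those above is to model the probability of meeting standards as a function of fall R-CBM, invert the model at the target probability, and bootstrap the data to get an interval around the resulting score. The sketch below does this with a logistic-regression fit and hypothetical data, using a standard with-replacement bootstrap; the slide describes bootstrapping without replacement, so the original procedure may have differed in detail:

```python
# Minimal sketch: the fall R-CBM score at which the estimated probability of
# meeting standards reaches a target (e.g., .8), with a bootstrap interval.
# Data are hypothetical; the modeling choices are assumptions, not the
# original analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical (fall R-CBM, met ISAT standards) pairs.
wrc = np.array([22, 30, 41, 44, 50, 57, 61, 65, 69, 73, 77, 80, 84, 88, 94, 102, 110, 118])
met = np.array([0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1])

def score_at_probability(x, y, target):
    """Fit P(meet) ~ logistic(WRC) and invert it at the target probability."""
    model = LogisticRegression().fit(x.reshape(-1, 1), y)
    b0, b1 = model.intercept_[0], model.coef_[0, 0]
    # logit(target) = b0 + b1 * wrc  ->  solve for wrc
    return (np.log(target / (1 - target)) - b0) / b1

estimates = []
while len(estimates) < 100:                # 100 usable bootstrap resamples
    idx = rng.integers(0, len(wrc), len(wrc))
    if len(set(met[idx])) < 2:             # need both outcomes to fit the model
        continue
    estimates.append(score_at_probability(wrc[idx], met[idx], target=0.8))

lo, hi = np.percentile(estimates, [2.5, 97.5])
point = score_at_probability(wrc, met, target=0.8)
print(f"WRC for P(meet) = .8: {point:.0f} (bootstrap interval {lo:.0f} to {hi:.0f})")
```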
