
Washington State Using Data to Drive Statewide Improvement Efforts

This webinar describes the evaluation approach used in Washington State to drive statewide improvement efforts, focusing on program theory, treatment, and impact evaluation, and presents early and preliminary findings along with the state's work to build a sustainable support structure for RTI.


Presentation Transcript


  1. Washington State: Using Data to Drive Statewide Improvement Efforts. Greg Roberts, Evaluation Research Services (ERS); Leslie Pyper, SPDG Director, Learning Improvement Coordinator, Special Education, OSPI. SPDG Director’s Webinar, November 17, 2010

  2. Part 1 (Greg): Evaluation Approach; Application of Evaluation Approach; Early and Preliminary Findings. Part 2 (Leslie): Thinking about Systems; Leveraging Evaluation Efforts; Creating Systemic Change in WA

  3. Evaluation Approach: Theory-driven evaluation. Program theory: what must be done to achieve desired goals, what other important impacts may also be anticipated, and how these goals and impacts would be generated; it includes normative theory and causative theory. Treatment refers to the services, materials, and activities thought to be essential to generating desired changes.

  4. Evaluation Approach: The program model serves as a vehicle for talking about exceedingly complex social phenomena; a basis for initially capturing important information; a basis for clearly and usefully reporting evaluation activities and findings; a means of identifying important research questions; a means of identifying program elements and outcomes for measurement; and a means of identifying important features of the implementation environment(s).

  5. Evaluation Approach: Causative theory (see figure above) represents empirical and substantive knowledge of the relationships that link a program’s treatments, its implementation processes, and its intended outcomes. Impact theory and impact evaluation are perhaps the most widely advertised elements of this framework.

  6. Washington State Application • Program theory/model • Evaluation questions • Evaluation design • Measures, indicators • Procedures • Survey • Site Visit • Extant Data

  7. Program Model

  8. Evaluation Questions To what extent are the School-wide Activities in Figure 1 implemented in funded districts? To what extent do Local Circumstances in Figure 1 inhibit or facilitate implementation of School-wide Activities? To what extent are School-wide Activities related to the Change Mechanisms in Figure 1? To what extent are School-wide Activities related to Outcomes in Figure 1? To what extent are Change Mechanisms related to Outcomes in Figure 1?

  9. Evaluation Questions To what extent are the School-wide Activities in Figure 1 implemented in funded districts? What Local Circumstances do districts and schools find challenging? What are the 2008-2009 levels of student outcomes in the funded and matched districts?

  10. Evaluation Design

  11. Early and Preliminary Findings Question 1: To what extent are the School-wide Activities in Figure 1 implemented in funded districts? Level of Implementation

  12. Early and Preliminary Findings Question 1: To what extent are the School-wide Activities in Figure 1 implemented in funded districts? Assessment Knowledge

  13. Early and Preliminary Findings Question 1: To what extent are the School-wide Activities in Figure 1 implemented in funded districts? Frequency of Assessment

  14. Early and Preliminary Findings Question 1: To what extent are the School-wide Activities in Figure 1 implemented in funded districts? Frequency of Decision-making

  15. Early and Preliminary Findings • Question 2: What Local Circumstances do districts and schools find challenging? • Teachers and others tend to endorse practices that are RTI-aligned, but mostly when they are not described in terms of RTI • Relatively limited understanding of what RTI is and how it works, and a number of site visit participants had questions about the degree to which their “model” aligns with research or best practice • Implementation theory

  16. Early and Preliminary Findings • What are the 2008-2009 levels of student outcomes in the funded and matched districts? • Percentage of students passing the 2008 WASL in Reading and Math, respectively • Baseline for ongoing comparison of changes in student outcomes

  17. State Assessment - Reading

  18. State Assessment - Math
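
The two slides above chart the percentage of students passing the 2008 WASL in reading and math for the funded and matched districts. As a rough illustration of how such a baseline comparison could be computed, the sketch below aggregates hypothetical district-level counts; the file name, column names, and pandas-based approach are illustrative assumptions, not the evaluators' actual procedure.

```python
# Hypothetical sketch: compare 2008 WASL passing rates in funded vs. matched
# districts as a baseline for tracking later changes in student outcomes.
# The file and columns (district, group, subject, n_tested, n_passing) are assumed.
import pandas as pd

df = pd.read_csv("wasl_2008.csv")

# Sum student counts within each district group (funded vs. matched) and subject,
# then compute the percentage of students passing.
baseline = df.groupby(["group", "subject"], as_index=False)[["n_tested", "n_passing"]].sum()
baseline["pct_passing"] = 100 * baseline["n_passing"] / baseline["n_tested"]

# Show funded and matched districts side by side for reading and math.
print(baseline.pivot(index="subject", columns="group", values="pct_passing").round(1))
```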

  19. Ongoing and Next Steps Small-scale study in one demo site (Walla Walla) Analysis of Year 2 data (2009-2010) Collection of Year 3 data (2010-2011) Refocusing program model and evaluation questions

  20. Feedback Pod Yes/No 1. Are you working with an evaluator in developing and/or monitoring implementation of RTI in your state?

  21. “Cheshire,” Alice began rather timidly, “Would you tell me please, which way I ought to go from here?” “That depends a good deal on where you want to get to,” said the Cat. “I don’t much care-” said Alice. “Then it doesn’t matter which way you go,” said the Cat. Alice’s Adventures in Wonderland by Lewis Carroll. Michigan Dept Ed. ppt 2-16-09

  22. State Leadership Team (all special ed folks) • Demo Districts • 295 School Districts • Educational Service Districts (ESDs) • OSPI (SEA) Special Ed / SPDG • Legislated as separate entities • Local Control

  23. Feedback Pod 2. In my state, regional technical assistance and training are provided … A. directly by the SEA. B. by other entities with direct guidance/training from the SEA to those entities. C. by other entities without coordinated influence from the SEA.

  24. 295 districts divided among 9 regional ESDs; Leslie = ESD liaison for special ed. CSA: improvement planning by region, using data profiles; root causes (linking activities to root causes); quarterly updates; evaluation of activities. *May choose to focus some efforts on RTI and/or PBIS… (asterisked on the map: Spokane, Seattle, Olympia)

  25. System Alignment: Federal → State → TA System → Local Education Agencies → Teachers/Staff → Effective Practices (Fixsen, 2008)

  26. WA System Alignment: Federal → State (OSPI) → [IHEs (Teacher Prep/Tech Asst), TA System (ESDs), Special Ed State Needs Projects] → LEAs → Teachers/Staff → Effective Practices. Adapted from Fixsen, 2008

  27. Other TA efforts --- State Needs Projects Projects are supposed to address a “state need” -- Hmmmmmm… Realized that there wasn’t a process -- • to identify a “state need” • to ensure consistency/alignment • to provide training/support to the project staff Needed a “system” to manage these projects and needed a system to provide TA to them.

  28. Contracted with UW-Tacoma to evaluate the State Needs Projects • Are they serving statewide? • Are they using resources/materials aligned with State policies and current research? • Are they providing on-going support? • Are they collecting reliable data? • Are they making adjustments based on evaluation of their efforts? • Are they having an impact on practice? • Are they having an impact on indicator performance?

  29. Build a sustainable State structure for RTI Build capacity of SEA, ESDs to support districts • Engage SEA departments/ESDs in strategic planning • Multiple depts + ESDs = Implementation Team • NCRTI – NWRCC – WRRC • Need to develop common language • Need to align messaging/PD/use of coaches… • Need to develop review/vetting process for RTI • Develop strategic plan for support

  30. 295 School Districts • Coordinated Services Agreement (CSA) w/ ESDs (regional TA providers) • SEA/ESD Implementation Team • Special Ed SPDG

  31. OSPI Implementation Team (really should be called the Transformation Team): Teaching & Learning (Reading, Math); Classroom Assessment; District & School Improvement; Special Education; Migrant/Bilingual/ELL; Title I/LAP; Highly Capable; Equity & Civil Rights; Secondary Ed & Dropout Prevention; Student Achievement (Achievement Gap); Ed Technology; Financial Resources & Governmental Relations (Data Governance); Educational Service Districts (ESDs)

  32. District & School Improvement & Accountability; Dropout Prevention; Comprehensive Assessment System; K-12 Reading Model; New Math Framework; SPDG. This would begin to align many efforts at the SEA, which would also impact the ESDs…

  33. In order to develop a strategic plan for the state, we needed to understand WHAT everyone was doing and HOW IT CONNECTED to the RTI structure. The Implementation Team developed an assessment tool (the OSPI Efforts Inventory) for assessing activities across the SEA (pilot 11/12/10 with two departments).

  34. OSPI Efforts Inventory (form fields: Department, Contact Person(s), Date)

  35. Improved instruction for students

  36. Feedback Pod True/False 3. We have developed common language across the SEA and our resources are aligned and collectively support the RTI framework.

  37. Build a sustainable State structure for RTI Build capacity of IHEs • engaged IHEs in IRIS Center training • recruited IHE reps on State LT (gen & sped) • engaged multiple IHEs in regional training • meeting with several IHEs re: RTI efforts UW-T 325T grant – participating on Advisory Board Project RTI: Restructuring, Transforming, Implementing a Dual-track RTI Teacher Preparation Program

  38. Feedback Pod True/False 4. We have developed common language across the state (IHEs, SEA, regional TA providers) and our resources are aligned and collectively support the RTI framework.

  39. PDSA Cycles: Trial & Learning (Shewhart, 1924; Deming & Juran, 1948; Six Sigma, 1990). Plan – decide what to do. Do – do it (be sure). Study – look at the results. Act – make adjustments. Cycle – do it over and over again until the intended benefits are realized. Dean L. Fixsen & Karen A. Blasé, 2009

  40. “Knew”: • Lack of understanding of pilot site requirements • Lack of consistent PD opportunities • Lack of a state structure for PD • Lack of involvement of SEA beyond sped • Lack of understanding about connections • Lack of alignment • No vetting of materials • Lack of common understanding of RTI framework • Lack of data system to manage RTI efforts • Lack of preparation for leadership of RTI efforts • Likely a lack of fidelity

  41. Evaluation results – • 1-2 PD opportunities (avg) • More than half gave incorrect response or no response regarding defining screening & progress monitoring • Collecting data but teachers struggling to use data • Basic knowledge of RTI lacking • Lack of buy-in • Difficulty moving students through tiers • Difficulty in using staff to provide interventions • Is anyone monitoring fidelity?

  42. So, need to develop a system to support demo sites and all others who want to implement RTI: • Develop a state structure for support of RTI • develop common language • develop a strategic (state-level) plan • align efforts (messaging, resources, training,…) • Purchase/develop a data system for demo sites • Provide ‘hands on’ support to demo sites

  43. Training for RTI demo sites • Screening/progress monitoring • Selecting EBPs • Adapting on-site technical assistance • Developing monthly TA calls (in addition to quarterly mtgs) • Developing a Guide for Selection of EBPs • Developing a Guide for Selection of Trainers/materials • Regional training – IHEs, ESD, multiple SEA staff • Using integrity rubric

  44. Build a data system. WA received $17.3 million for a longitudinal data system -- requested meeting with data folks (SEA) -- have been meeting with Student Information Director and Data Governance Coordinator (SEA) -- also invited DSIA, Classroom Assessment. Spectrum K12 demo for Implementation Team; reviewing training plans; meeting to draw up a “proof of concept” system

  45. Supports for Implementation RTI Coordinator mtgs (quarterly) Site visits (technical assistance) Will begin monthly contacts to provide additional supports Developing a series of trainings for all (will consist of face-to-face, webinar, and recorded sessions)

  46. Questions for RTI Coordinators: How did you use your data this past year? Student-level data; building-level data; district-level data. Protocols in place? How is your data system driving implementation in your district?
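
To illustrate the kind of roll-up these coordinator questions point to, the sketch below aggregates hypothetical student-level screening results into building- and district-level summaries; the file name, column names, and benchmark cut score are assumptions for the example, not part of any actual district data system.

```python
# Hypothetical sketch: roll student-level screening scores up to building- and
# district-level summaries so each level of the system can review its own data.
# The file, columns (district, building, student_id, score), and the benchmark
# score of 40 are illustrative assumptions.
import pandas as pd

screening = pd.read_csv("fall_screening.csv")
screening["at_benchmark"] = screening["score"] >= 40

# Building-level view: students screened and the share meeting benchmark (0-1).
by_building = screening.groupby(["district", "building"]).agg(
    students=("student_id", "count"),
    share_at_benchmark=("at_benchmark", "mean"),
)

# District-level view: the same summary one level up.
by_district = screening.groupby("district").agg(
    students=("student_id", "count"),
    share_at_benchmark=("at_benchmark", "mean"),
)

print(by_building, by_district, sep="\n\n")
```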

  47. Feedback Pod 5. My SEA used data this past year to (check all that apply): • improve content, delivery, format, or target audience of PD • revise guidance on RTI framework • influence policy-makers, TA providers, SEA divisions • improve data collection • determine funding efforts • revise evaluation efforts • obtain technical assistance/support • other: _____________________

  48. Intensive Technical Assistance w/ the National Center on RTI (MOU) • Common language • Alignment across efforts • Build capacity at the SEA, IHEs, ESDs, & districts to support implementation of evidence-based practices (EBPs) • Data system for demonstration districts (expandable) • Information dissemination system to expand RTI efforts across universities, districts and professional organizations

  49. Effective State Support Structures for RTI Include… Leadership, consensus, and aligned vision; consistent messaging across departments; capacity-building supports for systems that support LEAs/schools (IHEs, ESDs), and for LEAs and schools directly; collection and use of data to inform decisions at all levels (student, classroom, school, LEA, SEA); regulatory language, in general and/or special education, as appropriate, to support RTI. From Innovations conference, 2010

  50. Invited to participate in the SISEP Community of Practice. Putting together a team of folks from the SEA & ESDs: from the Implementation Team and from the CSA group. Our CSA group is now looking at “scaling up evidence-based practices”
