
Results Driven Accountability


Presentation Transcript


  1. Results Driven Accountability SSIP evaluation tool

  2. OFFICE of SPECIAL EDUCATION PROGRAMS Improving Data, Improving Outcomes Conference New Orleans September 10, 2014 Gregg Corr Director Monitoring and State Improvement

  3. Session objectives 1. Share draft versions of the SSIP evaluation tool 2. Provide you with an opportunity to ask questions for clarification and give feedback

  4. OSEP's Goal • That all States submit high-quality Phase I SSIPs by April 1, 2015. • OSEP and its TA providers will provide differentiated support to assist States in meeting this goal.

  5. High-quality SSIP = Improved outcomes for children with disabilities!

  6. Don’t get stuck here!

  7. Support to States • SSIP Implementation Support Visits and Conference Calls • Support from OSEP-funded TA Centers • Materials posted on GRADS 360: https://osep.grads360.org/#program/highlighted-resources • Center for Systemic Improvement (CSI) • SSIP Evaluation Tool

  8. SSIP Evaluation Tool • Process for development. • This is a draft! We invite your input. • In addition to this rollout, we have scheduled a TA call for States on Thursday, September 11, at 4:00 EDT.

  9. Overview of the tool • Based on the five components described in Phase I of the Measurement Table under Indicator C-11: • Data Analysis; • Analysis of Infrastructure to Support Improvement and Build Capacity; • State-identified Measurable Result for Infants and Toddlers with Disabilities (SIMR); • Selection of Coherent Improvement Strategies; and • Theory of Action

  10. Overview of the tool • Each of the five components is composed of different elements. • These elements will be analyzed by OSEP staff members to rate the quality of each component and the overall quality of Phase I of the State’s SSIP.

  11. Statutory requirements • As part of the State Performance Plan (SPP), the SSIP must meet requirements applicable to all indicators specified at section 616(b) of the IDEA.

  12. Statutory requirements For Indicators C11/B17, the State must provide: • FFY 2013 baseline data, expressed as a percentage and aligned with the State-identified Measurable Result(s) (SIMR) for children with disabilities; • Measurable and rigorous targets (expressed as percentages) for each of the five years from FFY 2014 through FFY 2018, with the FFY 2018 target reflecting measurable improvement over the FFY 2013 baseline data; and • A plan that includes a description of how the State will improve performance on the SIMR.

  13. SPP Approval OSEP will: • Review each State’s SPP, including the SSIP. • “Triage” each SPP to determine if the submission meets IDEA requirements. • Notify any State whose SPP does not meet legal requirements and provide an opportunity for resubmission.

  14. Quality Analysis of SSIP • OSEP’s goal is that all States submit high-quality SSIPs by April 1, 2015. • The primary purpose of OSEP’s Quality Analysis is to determine the extent to which each component clearly and comprehensively addresses each element in the SPP/APR Measurement Table.

  15. Important! • OSEP will not be using a State’s Phase I SSIP as a factor in the 2015 Determination Process. • OSEP’s review will form the basis for feedback to the State on the quality of its submission. • Phase I is the foundation of a six-year improvement plan. It’s critical that we get it right!

  16. Evaluation Process • Each SSIP will be independently reviewed by OSEP evaluators using the SSIP Evaluation Tool. • After independent review, evaluators will meet to discuss their analyses of each component. • Evaluators will agree on an Overall Rating for the SSIP, identifying: • Areas of strength • Areas for which the State needs support

  17. Quality Rating Scale Three-part scale: HIGH QUALITY, ADEQUATE QUALITY, LOW QUALITY

  18. High Quality • All elements addressed • Comprehensive and well-developed • Clear description of process and decisions • Stakeholder engagement, multiple data sources, current research, and evidence-based practices • Proposed improvement strategies align with current State initiatives • Account for current strengths and needs • Decisions and strategies are logical and supported by evidence-based research and State context. • Cohesion among elements and across components.

  19. Adequate Quality • Essential criteria addressed, but: • Additional explanation is needed for clarity • Additional information is needed to ensure comprehensiveness • Initiatives need increased alignment • Decisions and proposed improvement strategies are logical, though additional refinement may be needed • There is cohesion among elements and components, although additional connections could be made.

  20. Low Quality • Not all elements addressed • Component not well developed; lacks clear explanation and details • Limited stakeholder engagement, data sources, use of research-based strategies • Current State initiatives not reflected • Decisions and proposed strategies not logical • Limited cohesion within and across elements and components

  21. Quality Analysis Worksheet The Worksheet provides prompts for elements in each of the five Phase I components. EXAMPLE Data Analysis: A description of: • How the State identified and analyzed key data • How the data were disaggregated by multiple variables • Concerns about data quality • Consideration of compliance data • Stakeholder involvement

  22. Your input! • Have we captured the essential concepts of the Phase I SSIP? • What’s not clear? • Could you use this tool to self-assess your State’s work? • What additional TA would be helpful?

  23. Thank you! Send your comments, suggestions and questions to: gregg.corr@ed.gov
