
Is Your School Improving Outcomes for Students with Disabilities: Guesswork or Science?


Presentation Transcript


  1. Is Your School Improving Outcomes for Students with Disabilities: Guesswork or Science? Presented by The Elementary & Middle Schools Technical Assistance Center, The American Institutes for Research

  2. EMSTAC Model: Insider–Outsider Approach • Linking Agent inside the district • TA Liaison outside the district, providing TA support to the school district

  3. Presentation Goals • Provide an overview of the importance of evaluation efforts for school-based interventions involving students with disabilities • Identify key steps in planning for and conducting an evaluation • Show EMSTAC district “success stories,” demonstrating effective interventions

  4. The Problem • Most schools have multiple programs or interventions occurring; many are targeted exclusively or partly at students with disabilities. • In many cases, however, the importance of a sound evaluation effort can be overlooked, underappreciated, and difficult to achieve.

  5. Common Evaluation Challenges • No “true” baseline data • Not enough data have been collected • Comparable data have not been collected • Not enough time to collect desired data • Longitudinal gaps (missing years) • Separating the effects of one intervention from another

  6. What Happens When You Don’t Evaluate? • Inability to demonstrate that interventions are working • Difficulty in obtaining sustained funding • Problems with “scaling up” the intervention to other sites • Hindered dissemination of best practices to other districts & states

  7. Evaluation is More Important Than Ever • The “era of accountability”: Increased pressure from governing bodies to demonstrate results • Too much focus on “just make things better” • Budget difficulties at federal, state, & local levels • IDEA ’97: Students with disabilities must be included in accountability efforts

  8. First Step: Getting a View of the Big Picture • Program planning and evaluation go hand in hand • Tailor the evaluation to program purpose & goals • Identify the purpose of the evaluation • Consider program reporting requirements • Think ahead about available resources • Be sensitive to the local context for the evaluation

  9. Program Purpose & Goals Program Reporting Requirements Evaluation Purpose Evaluation Local Context Time & Resources The Big Picture

  10. Second Step: Identifying the Evaluation Questions • Questions related to program implementation: program context, program delivery, access to the program • Questions related to program impact: impact on student performance, impact on teacher capacity, impact on moving research to practice

  11. Third Step: Selecting a Design that Will Provide the Data You Need • Experimental designs • Quasi-experimental designs • Simple before & after studies • Time series designs • Ethnographic research
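
The design choice drives what analysis is even possible later. As a rough, purely hypothetical illustration (Python with scipy; all scores below are invented, not district data), the sketch contrasts a simple before & after comparison with a quasi-experimental variant that adds a non-randomized comparison group and compares gain scores.

```python
# Hypothetical sketch: contrasts a simple before & after design with a
# quasi-experimental gain-score comparison. Scores are invented.
from scipy import stats

# Students receiving the intervention
pre_program  = [41, 38, 45, 50, 36, 44, 39, 47]
post_program = [48, 42, 51, 55, 40, 49, 45, 52]

# Simple before & after: did the same students improve?
before_after = stats.ttest_rel(pre_program, post_program)
print(f"Before/after paired t-test: p = {before_after.pvalue:.3f}")

# Quasi-experimental: compare gains against a school that did not
# implement the program (groups are not randomly assigned).
pre_comparison  = [40, 37, 46, 49, 35, 45, 38, 48]
post_comparison = [42, 38, 47, 50, 36, 46, 40, 49]

gains_program    = [b - a for a, b in zip(pre_program, post_program)]
gains_comparison = [b - a for a, b in zip(pre_comparison, post_comparison)]
contrast = stats.ttest_ind(gains_program, gains_comparison)
print(f"Gain-score contrast (quasi-experimental): p = {contrast.pvalue:.3f}")
```

The quasi-experimental version is usually more convincing because it helps separate program effects from whatever else changed between the pre and post years.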

  12. Fourth Step: Identify & Develop Tools for Collecting Data • Direct observation • Records & documents • Physical artifacts • Information from school administrators, teachers, students, & parents • Special considerations: time, reliability & validity, training data collectors, permission to collect data & informed consent
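
It often helps to settle on a record layout for each data source before collection begins. The sketch below (Python; the field names and file name are illustrative assumptions, not an EMSTAC instrument) shows one way to structure classroom observation records so that the collector and consent status are tracked alongside the measure itself.

```python
# Illustrative only: one possible layout for classroom observation
# records; field names are assumptions, not an EMSTAC format.
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class ObservationRecord:
    school: str
    observer: str          # trained data collector
    date: str              # date of observation (ISO format)
    minutes_on_task: int   # example observation measure
    consent_on_file: bool  # informed consent obtained before collection

records = [
    ObservationRecord("School A", "Observer 1", "2001-10-15", 38, True),
    ObservationRecord("School A", "Observer 2", "2001-10-16", 42, True),
]

with open("observations.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=[f.name for f in fields(ObservationRecord)])
    writer.writeheader()
    writer.writerows(asdict(record) for record in records)
```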

  13. Fifth Step: Data Analysis • Storing data • Quantitative analyses: Descriptive and univariate statistics • Quantitative analyses: Inferential statistics • Qualitative analysis
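
To make the quantitative pieces concrete, here is a minimal sketch (Python with pandas and scipy; the scores are invented, not from any district) of how stored pre/post data might feed descriptive statistics and a simple inferential test.

```python
# Minimal sketch with invented data: descriptive statistics plus a
# nonparametric inferential test on paired pre/post scores.
import pandas as pd
from scipy import stats

# "Stored" evaluation data: one row per student (hypothetical)
scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "pre_score":  [43, 39, 50, 46, 41, 48, 37, 45],
    "post_score": [52, 44, 57, 51, 47, 55, 40, 50],
})

# Descriptive / univariate statistics
print(scores[["pre_score", "post_score"]].describe())
print("Mean gain:", (scores["post_score"] - scores["pre_score"]).mean())

# Inferential statistics: Wilcoxon signed-rank test on the paired scores
result = stats.wilcoxon(scores["pre_score"], scores["post_score"])
print(f"Wilcoxon signed-rank test: p = {result.pvalue:.3f}")
```

Qualitative data (interview notes, open-ended survey responses) still need a plan of their own, such as coding responses into themes before any counting is done.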

  14. Sixth Step: Reporting the Findings • Formal and informal reports • Targeting the audience • Keep it simple and straightforward • Brief reports • Website • Public reports • Support program improvement

  15. EMSTAC “Success Stories” • Implementation looks different in each district • Varying resources, time, staff, & experience with evaluation • Examples of EMSTAC-supported sites with positive student outcome results • Allegany County, MD • Detroit, MI • East Grand, CO • Los Angeles, CA

  16. Allegany County, Maryland • Planning • Evaluation planned after program implementation • Evaluation questions • Will the program and linked professional development help general education teachers make instruction meaningful for all students? • Are we going to see improvement in test scores? • Design • Simple before and after design (Norm-referenced)

  17. Allegany County, Maryland • Data collection • Test results (e.g., CTBS, curriculum-based measures) • Teacher surveys • Data analysis • Analyze final scores • Challenges with student attrition • Reporting • Published data in department newsletter • Plan scale-up efforts

  18. Allegany County, Maryland • Early Literacy Program • EMSTAC assisted with a needs assessment in 1998 • Implemented portions of the Early Literacy Program (MSU) • Training occurred in summer 1999 • Data collected are based on CTBS pre/post scores (1998/2000) • Language: 43 to 59.5 (28%) • Language Mechanics: 43 to 54.0 (20%) • Language Composition: 47 to 57.0 (18%)

  19. Detroit, Michigan • Project ACHIEVE • Internal needs assessment, program selection, & training process took a year • Implementation began in one middle school in 1999 • By 2000–2001, the school saw a decrease in code-of-conduct violations • Class 1 referrals: 1,914 to 931 (51%) • Class 2 referrals: 394 to 227 (42%) • Class 3 referrals: 18 to 8 (56%)
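
The percentage drops above follow from simple percent-change arithmetic; the short sketch below (Python) just reproduces them from the referral counts reported on the slide.

```python
# Reproduces the percent reductions reported on the slide from the
# before/after referral counts (rounded to whole percents).
referrals = {
    "Class 1": (1914, 931),
    "Class 2": (394, 227),
    "Class 3": (18, 8),
}

for label, (before, after) in referrals.items():
    reduction = (before - after) / before * 100
    print(f"{label} referrals: {before} -> {after} ({reduction:.0f}% decrease)")
```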

  20. East Grand, Colorado • Literacy across the curriculum • District conducted a needs assessment • EMSTAC assisted with program selection via the web • Implementation began in 1999 in one school • Results are based on pre/post test results (1999, 2001) from the CO State Assessment Program • Reading: up 5 points (70 to 75) • Math: up 16 points (50% to 66%) • Writing: unchanged

  21. Los Angeles, California • Peer-Assisted Learning Strategies • District conducted a needs assessment • PALS implementation began in 2000 • Measures included CBM tools and probes (fall and spring) • Probes indicated that during 2000, all grade levels using the program saw increases in WCPM (words read correctly per minute) • Thus far, second-grade classrooms have had the highest gains, between 40% and 97%

  22. Conclusion • Evaluation is increasingly important. • A set of key principles guides sound evaluation efforts. • You don’t have to be an expert to organize & conduct a sound evaluation. • There are many useful resources for practitioners undertaking evaluation efforts (e.g., the EMSTAC Evaluation Guide).

  23. Presenters • Jim Hamilton, EMSTAC Director • Don Dailey • Bradley Carl • Suzanne Ritter • Contact us via www.emstac.org
