
Curriculum-based Measures (CBM): The Cornerstone of the RTI Pyramid



Presentation Transcript


  1. Curriculum-based Measures (CBM): The Cornerstone of the RTI Pyramid Long Island Association for Supervision and Curriculum Development Friday, October 17, 2008

  2. Lindenhurst Public Schools – The Team Joseph LaMelza – Assistant Superintendent Donna Smawley – Principal, Bower School Roni Loud – Psychologist, Bower School Maria Bohrer – Reading Teacher, Bower School Carol Grasso – Kindergarten Teacher, Bower School Debra Mauro – Reading Teacher, Bower School

  3. Presentation Objectives • Understanding curriculum-based measures • Recognizing the importance of early literacy skill development • Identifying factors that contribute to the effective implementation of RTI • Understanding the necessity of managing data • Sharing ideas and insights

  4. Essentials of Reading Instruction • Reading instruction MUST focus on instructional strategies that improve overall reading ability, NOT only on isolated skills (Goodman; Pearson, 2006)

  5. What is CBM? • CBM is an approach to measuring the growth of student proficiency in the core educational skills that contribute to success in school. It is a fast, inexpensive, and easy-to-use system that allows teachers to continually measure their students’ growth in performance, determine whether their students are growing at the expected rate, and provide data for evaluating their instructional strategies when students are not demonstrating adequate growth. Deno, Lembke, and Anderson

  6. CBM as General Outcome Measures - GOM • Relevant Features • Measure Big Ideas • Efficient • Standardized • Sensitive to growth and change over time and to the effects of intervention

  7. CBM Levels of Performance • Accuracy – able to do something without error, but only slowly and with concentration • Fluency – can do something quickly without errors (no more than 5%). Fluency comes after accuracy and only with practice • Automaticity – can do something quickly, without error, and in the presence of distracters. Automaticity comes after fluency and with considerable practice.
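
The 5% error ceiling mentioned for fluency lends itself to a quick check during scoring. Below is a minimal sketch in Python, assuming the word and error counts come from a timed probe; the function name, parameters, and default threshold are illustrative choices, not part of any published CBM standard.

```python
# Check whether a timed reading sample meets the <= 5% error ceiling described for fluency.
# The names and the default threshold are illustrative assumptions.
def meets_fluency_accuracy(words_read: int, errors: int, max_error_rate: float = 0.05) -> bool:
    """Return True when the proportion of errors in the sample is at or below the ceiling."""
    if words_read == 0:
        return False
    return errors / words_read <= max_error_rate

print(meets_fluency_accuracy(words_read=98, errors=4))  # 4/98 is about 4.1% -> True
```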

  8. CBM = Improvement in RATE Fluency and automaticity are measured by rate (how fast it can be performed). Rate increases gradually as proficiency develops - which means it is measured over time. Improvement in rate is a measure of progress.
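
Because rate is measured repeatedly over time, "improvement in rate" is usually summarized as a slope. A minimal sketch, assuming one words-correct-per-minute (WCPM) score per week; the scores below are invented for illustration.

```python
# Estimate the weekly rate of improvement as the slope of an ordinary least-squares line
# fit to weekly words-correct-per-minute (WCPM) scores. The scores are made-up examples.
from statistics import mean

weekly_wcpm = [42, 45, 44, 49, 52, 55]        # one probe score per week (hypothetical)
weeks = list(range(len(weekly_wcpm)))

x_bar, y_bar = mean(weeks), mean(weekly_wcpm)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(weeks, weekly_wcpm))
         / sum((x - x_bar) ** 2 for x in weeks))

print(f"Rate of improvement: {slope:.1f} WCPM per week")  # about 2.6 here
```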

  9. READING RATE = COMPREHENSION • Reading connected text rapidly and accurately plays a crucial role in a student’s ability to comprehend. Rapid word recognition frees up cognitive resources for higher-level comprehension processes (Fuchs et al., 2001)

  10. ORAL READING FLUENCY IS RELATED TO OUTCOMES • Oral reading fluency predicted satisfactory comprehension skills with 80% accuracy for Grade 1 students and with 70% accuracy for Grade 2 students • Students with satisfactory oral reading fluency but low comprehension may have poor vocabulary skills • Students with good reading speed and accuracy but poor comprehension are the exception rather than the rule (Riedel, 2007)

  11. FLUENCY IS MORE THAN SPEED! The most legitimate use of oral reading speed is as Deno (1985) brilliantly conceptualized it: a way to monitor student progress. The danger of using speed as the measure is that some students and teachers focus on speed at the expense of understanding. Students need to simultaneously decode and comprehend using texts that increase in difficulty (Samuels, 2007)

  12. R-CBM as a Predictor • Oral reading fluency correlates highly with comprehension • .67 (Good et al., 2001) and .70 (Buck and Torgesen, 2003) with state reading assessment scores for Grade 3 • .73 with Stanford Achievement for Grade 1 (Cook, 2003) • .76 with Woodcock-Johnson Broad Reading Cluster (Roberts, 2005) for Grade 1
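
For teams that want to see how such coefficients are obtained, here is a hedged sketch that computes a Pearson correlation between local oral reading fluency scores and a later outcome measure. Both score lists are invented for illustration and do not reproduce the studies cited above.

```python
# Pearson correlation between oral reading fluency (WCPM) and a later outcome score.
# Both lists are fabricated illustration data, not values from the cited studies.
from math import sqrt

orf = [35, 52, 61, 78, 90, 44, 70, 58]                # winter ORF scores (hypothetical)
outcome = [610, 640, 655, 690, 702, 625, 675, 650]    # later test scores (hypothetical scale)

n = len(orf)
mx, my = sum(orf) / n, sum(outcome) / n
cov = sum((x - mx) * (y - my) for x, y in zip(orf, outcome))
r = cov / sqrt(sum((x - mx) ** 2 for x in orf) * sum((y - my) ** 2 for y in outcome))
print(f"Pearson r = {r:.2f}")
```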

  13. What Makes a Big Idea a BIG IDEA? • Predictive of reading acquisition and later reading achievement • Something we can do something about, i.e., something we can teach • Something that improves outcomes for children if/when we teach it Graney, 2006

  14. BIG IDEAS • Phonemic Awareness • Alphabetic Principle • Accuracy and Fluency with connected text • Vocabulary • Comprehension National Reading Panel, 2000

  15. Early Literacy Probes / DIBELS • Most research is based on the body of knowledge regarding R-CBM • Early literacy probes were designed to be a downward extension of CBM before reading • Early literacy probes are short-term measures • Early literacy probes are in the CBM family, but they measure pre-skills. *Don’t test pre-skills when you can test the skill itself. Shinn, 2008

  16. How can we use CBM to change Reading Outcomes? • Begin Early • Focus Instruction on the BIG IDEAS of Early Literacy • Focus Assessment on Outcomes for Students

  17. CBM in Practice The Big Ideas for Preventing Reading Failure • Increase the quality, consistency, and reach of instruction • Universal screening with timely and valid assessments of reading growth as progress monitoring – formative vs. summative assessment • Provide more intensive interventions to “catch up” the struggling reader Adapted from: Torgesen/Shinn, 2008

  18. REMEMBER . . . • CBM measures are indicators • CBM is a specific set of procedures • CBM is for evaluation of instruction. It does not require a specific instructional technique • Use of CBM formative evaluation increases student achievement. Graney, 2006

  19. Definition of RTI • High-quality instruction/intervention that is matched to students’ needs and has been demonstrated through scientific research and practice to produce high learning rates for most students • Learning rate and level of performance are the primary sources of information used in ongoing decision-making • Important educational decisions about intensity and duration of interventions are based on individual student’s response to instruction across multiple tiers of intervention. National Association of State Directors of Special Education, 2005

  20. Multi-Tiered Response [pyramid graphic: ALL → SOME → FEW]

  21. CORE Concepts of RTI • Research-based instruction – core programs are taught with fidelity as intended to maximize effectiveness. Instruction is focused on achieving state standards • Use of data to inform instruction – universal screening of all students to measure and to monitor the development of skills – provide program accountability • Measurement of response – progress monitoring is used to determine the effectiveness of interventions – it is systematic, documented, and shared with staff

  22. Interventions are NOT • Shortened assignments • Preferential seating • Parent contacts • Classroom observations • Suspensions • Doing more of the same assignments • Retention McCook, J., 2005

  23. Intervention Organized in Tiers • Layers of intervention responding to students’ needs • Each tier provides more intensive and supportive intervention • Aimed at preventing reading disabilities Torgesen, 2004

  24. Multi-Tiered Response [diagram: literacy tiers with CBM benchmarks, progress monitoring, and strategic monitoring]

  25. 3 Tier Model for RTI

  26. Benchmark Assessments Kindergarten Fall – Initial Sound Fluency (ISF), Letter Naming Fluency (LNF), Letter Sound Fluency (LSF) Winter – Letter Naming Fluency (LNF), Letter Sound Fluency (LSF), Phoneme Segmentation Fluency (PSF), Nonsense Word Fluency (NWF) Spring – Same as Winter

  27. Benchmark Assessments Grade 1 Fall – Letter Naming Fluency (LNF), Letter Sound Fluency (LSF), Phoneme Segmentation Fluency (PSF), Nonsense Word Fluency (NWF) Winter – Phoneme Segmentation Fluency (PSF), Nonsense Word Fluency (NWF), Oral Reading Fluency (ORF), Maze Spring – Same as Winter

  28. Benchmark Assessments – Cont’d Grades 2–5 • Oral Reading Fluency (ORF) • Maze (Comprehension)
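
The schedule on slides 26–28 can also be written down as a simple lookup table so staff can see which probes each grade takes at each benchmark window. A minimal sketch, assuming (as slide 30 implies) that Grades 2–5 take the same two probes at all three windows; the abbreviations follow the slides.

```python
# Benchmark schedule from slides 26-28 as a lookup table (grade band -> season -> probes).
BENCHMARK_SCHEDULE = {
    "K":   {"Fall":   ["ISF", "LNF", "LSF"],
            "Winter": ["LNF", "LSF", "PSF", "NWF"],
            "Spring": ["LNF", "LSF", "PSF", "NWF"]},
    "1":   {"Fall":   ["LNF", "LSF", "PSF", "NWF"],
            "Winter": ["PSF", "NWF", "ORF", "Maze"],
            "Spring": ["PSF", "NWF", "ORF", "Maze"]},
    "2-5": {"Fall":   ["ORF", "Maze"],
            "Winter": ["ORF", "Maze"],
            "Spring": ["ORF", "Maze"]},
}

print(BENCHMARK_SCHEDULE["1"]["Winter"])  # ['PSF', 'NWF', 'ORF', 'Maze']
```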

  29. Problem-solving Model – IST The process, not the interventions, is standardized. Individualized plan for each child that involves different levels of consultation: • Description of student’s problem • Data collection and problem analysis • Intervention design and implementation – differentiated instruction determined by data • Progress monitoring • Evaluation of intervention effectiveness • Flexible groupings throughout the year Wilson, 2007

  30. Three Levels of Assessment • Benchmark Assessment – 3 times a year: Are there children who need additional support? How many? Which children? What to do? Evaluate the benchmark assessment data. • Progress Monitoring – assess at-risk children more frequently (every two weeks): Are current programs sufficient to keep progress on track, or are additional supports / interventions needed? • Strategic Monitoring – weekly monitoring
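
One way to operationalize these three levels is a simple decision rule applied to each benchmark score. The sketch below keeps the frequencies from this slide (benchmark three times a year, progress monitoring every two weeks, strategic monitoring weekly), but the cutoff and the assignment of score bands to biweekly versus weekly monitoring are assumptions, not district policy.

```python
# Map a benchmark score to a monitoring frequency. The 80% cutoff and the band-to-frequency
# assignment are illustrative assumptions; only the frequencies come from the slide.
def monitoring_plan(benchmark_score: float, target: float) -> str:
    if benchmark_score >= target:
        return "benchmark assessment only (3 times a year)"
    if benchmark_score >= 0.8 * target:
        return "progress monitoring (every two weeks)"
    return "strategic monitoring (weekly)"

print(monitoring_plan(38, target=40))  # progress monitoring (every two weeks)
print(monitoring_plan(25, target=40))  # strategic monitoring (weekly)
```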

  31. What decisions do we make with data? • Plan for support with focus on BIG IDEAS. • Grouping – small group instruction, homogeneous groups, differentiated instruction, flexible grouping. • Time – How much? How frequently? When? • Teacher / Student Interactions – modeling, direct explanation, increase student engagement, increase guided practice with immediate feedback, scaffolding to support learning, review

  32. Getting Started… • Select a team – • Classroom teachers, reading specialists, psychologist, building principal, special education teacher(s), speech teacher, others – people who have a vested interest in reading and literacy outcomes. • Attend training sessions • Plan for data collection – • Who will collect data? • When will you collect data? • How will you collect data?

  33. Collecting Data • Plan and Schedule Data Collection • Organize Resources • Collect Data • Enter the Data • Use Data for Educational Decision Making

  34. Scheduling Data Collection • Classroom Approach – Obtain coverage for classroom teacher. Approximately 1-2 minutes per benchmark per student. Teacher works in hallway / room. Advantages – Teachers assess own students, less disruptive to entire school. Disadvantages – Loss of instructional time, coverage, requires more days. • Building-wide Approach – Multiple specialists / trained members of team will assess students. Teacher brings class to library, cafeteria, gym, or other location with tables. Entire class can be assessed in 30 minutes. Advantages – can be completed in one day, minimal classroom disruptions and loss of instructional time. Disadvantages – space, trained staff, teachers not assessing.
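
The trade-off between the two approaches comes down to arithmetic. A quick sketch, assuming a class of 25 and the midpoint of the 1-2 minutes per student quoted on the slide (the class size and the midpoint are assumptions):

```python
# Rough time estimate for the two scheduling approaches described on this slide.
class_size = 25                      # assumed class size
minutes_per_student = 1.5            # midpoint of the 1-2 minutes per benchmark probe

classroom_minutes = class_size * minutes_per_student   # one teacher testing serially
building_wide_minutes = 30                             # slide: whole class done in ~30 minutes

print(f"Classroom approach: about {classroom_minutes:.0f} minutes per class, one assessor")
print(f"Building-wide approach: about {building_wide_minutes} minutes per class, several assessors")
```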

  35. Data Management System AIMSweb – Achievement Improvement Monitoring System www.aimsweb.com

  36. Instructional Recommendations

  37. Improvement Report

  38. Student Record

  39. Progress Monitoring Chart
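
Since this slide is an image, here is a minimal sketch of the kind of chart it shows, assuming weekly ORF scores plotted against an aimline drawn from a baseline score to a year-end goal; every number below is invented for illustration.

```python
# Plot weekly ORF scores against an aimline from baseline to a year-end goal.
# All values are fabricated illustration data.
import matplotlib.pyplot as plt

weeks = list(range(1, 11))
scores = [28, 30, 29, 33, 35, 34, 38, 40, 41, 44]   # weekly words correct per minute
baseline, goal, total_weeks = 28, 60, 30
aimline = [baseline + (goal - baseline) * w / total_weeks for w in weeks]

plt.plot(weeks, scores, marker="o", label="Weekly ORF score")
plt.plot(weeks, aimline, linestyle="--", label="Aimline to year-end goal")
plt.xlabel("Week")
plt.ylabel("Words correct per minute")
plt.title("Progress Monitoring Chart (illustrative)")
plt.legend()
plt.show()
```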

  40. School Readiness for RTI • Assessment: screening measures, progress monitoring practices and procedures • Curriculum: high-quality, research-based core curricula • Instruction: focus on effective instruction and interventions

  41. School Readiness – Continued • Positive School Climate: school-wide processes and structures, individual student interventions, and a professional learning community • Professional Development: outcome-focused content and ongoing assistance • Leadership: problem solving and individual characteristics of strong leaders Closing the Achievement Gap: School Readiness for RtI, Sopris West Educational Services, 2007

  42. See . . . Fuchs, L. S., & Fuchs, D. (2006). Best practice in progress monitoring reading and mathematics at the elementary level. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 2147-2164). Bethesda, MD: National Association of School Psychologists. Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM: A practical guide to Curriculum-based Measurement. New York, NY: Guilford. Wayman, M. M., Wallace, T., Ives Wiley, H., Ticha, R., & Espin, C. (2007). Literature synthesis on curriculum-based measurement in reading. The Journal of Special Education, 41(2), 85-120. Shinn, M. R. (2008). Best practices in Curriculum-based Measurement and its use in a Problem-solving model. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 243-262). Bethesda, MD: National Association of School Psychologists.

  43. Thank You for Your Attention and Participation
