
Response to Intervention, Problem Solving, and the 3 Tier Model: Universal Data Collection and Assessment



Presentation Transcript


  1. Response to Intervention, Problem Solving, and the 3 Tier Model: Universal Data Collection and Assessment. Ruth Poage-Gaines, IASPIRE Regional Coordinator. 11-16-09. Presentation materials from Mary Miller, IASPIRE

  2. Acknowledgements • Mark Shinn and the IASPIRE North Region Coordinators (Barb Curl, Christine Martin, Madi Phillips, Ben Ditkowsky, Pam Radford, Janice Miller, Christine Malecki) • D300 Carpentersville RtI Team: Mary Miller, Coordinator

  3. Expected Outcomes • Familiarity with general assessment principles • An understanding of how summative assessment differs from formative assessment • An understanding of mastery measurement vs. general outcome measures • Problem identification through the referral system vs. universal screening data • Norms- vs. standards-based approaches to defining at-risk populations • Understanding Curriculum-Based Measurement • How to use CBMs for program accountability

  4. Shift in approach: from Assessment OF Learning to Assessment FOR Learning

  5. General Assessments • Types of Assessments: • Screening: screening tests identify at-risk students according to a designated cut score • Formative: formative assessment is ongoing • Summative: summative assessment is often used at the end of major units of instruction and at year’s end • Diagnostic: diagnostic assessments can be used for screening, or for formative or summative assessment

  6. General Assessment Principles • All assessment should be “planful” • Tests should be given to answer a specific question about a child’s performance • Use summative and formative evaluation • Shift from what has been learned to what is being learned • Move the focus from unalterable variables to alterable variables that educators can do something about

  7. Variables Related to Student Achievement

  8. Diagnostic Tests • Give information on specific skills that need to be taught • Take longer to administer and score • Work best when tied to the curriculum and/or targeting important skills • Standardized diagnostic tests are often used for determining eligibility for programming

  9. General Assessment Principles • Mastery Measurement vs. General Outcome Measurement • Mastery measurement (i.e., summative) is a measure of a child’s mastery of a concept or curriculum presented • General outcome measures (e.g., CBM) are not tied to a specific curriculum and measure progress on long-term goals

  10. The 3-Tier Model (Academic and Behavioral Systems) • Tier 3, Intensive Individual Interventions (~5% of students): individual students; assessment-based; high intensity and of longer duration (academic) / intense, durable procedures (behavioral) • Tier 2, Targeted Group Interventions (~15% of students): some students (at-risk); high efficiency; rapid response • Tier 1, Core Instructional Interventions (~80% of students): all students, all settings; preventive, proactive

  11. Successful 3 Tier Models Have…. • A continuum of services and/or programs across the tiers that are scientifically based • Methods of identifying students at risk for academic failure and for evaluating/monitoring progress across the tiers, ideally those that are considered scientifically based • Efficient, COMMON methods of communicating student performance for all disciplines (i.e. progress monitoring)

  12. “If I had 1 hour to save the world, I would use 50 minutes to define the problem.” – Albert Einstein

  13. A Problem Defined… • At Tier 3: • The difference between an individual student’s performance and a criterion of success in a curriculum area. • At Tier 2: • The difference between at-risk students’ performance and a criterion of success in a curriculum area. • At Tier 1: • The difference between how many students are proficient on their accountability assessments and 100%. The desired state is for all students to be proficient. (NASDSE, 2006)

  14. Identifying Student Need: 1. Universal Screening

  15. Identifying Student Need: 2. Referral-Driven

  16. Schools Use Specific Tools for Specific Assessment Purposes

  17. Universal Screening • The basic question in a screening measure is whether or not the student should be judged as “at risk” • For a screening measure to be useful, it should satisfy three criteria: • Accurately identify students who require further assessment • Be practical • Make efficient use of resources

  18. Universal Screening Practices: • Universal screening and benchmarking data are collected at the beginning of the school year. • The school leadership team decides whether to use a norms- or standards-based discrepancy for identifying problems. • Teams use the data to make decisions about potential problems. • Programs and resources are allocated to each of the three tiers based on the data.

  19. Use Benchmark Data for Universal Screening: 2 Approaches to Identifying Students • 1. Norm-based approaches to identify the most needy students • 2. Standards-based approaches to identify intensity of programs and progress monitoring frequency

  20. Methods of Measuring Performance Discrepancies • Norm-Based Approaches • Percentile Rank Cut Scores • Discrepancy Ratios (Tiers 2 and 3) • Standards-Based Approaches • Illinois AIMSweb Standards (Cut Scores for ISAT and Minnesota State Test) • Oregon DIBELS Standards (Cut Scores for Oregon State Test)
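
To make the contrast concrete, here is a minimal Python sketch (an editorial illustration, not part of the original slides; all scores are hypothetical) of flagging students with a norm-based percentile cut versus a fixed standards-based cut score:

```python
import numpy as np

# Hypothetical fall benchmark R-CBM scores (words read correctly) for one grade.
scores = np.array([12, 25, 31, 38, 42, 47, 55, 58, 63, 70, 74, 81, 88, 95, 104])

# Norm-based approach: flag students below a percentile-rank cut score,
# here the 25th percentile of the local distribution.
norm_cut = np.percentile(scores, 25)
norm_flagged = scores[scores < norm_cut]

# Standards-based approach: flag students below a fixed criterion tied to a
# state test (55 WRC is the 2nd grade criterion cited later in these slides).
standards_cut = 55
standards_flagged = scores[scores < standards_cut]

print(f"Norm-based cut ({norm_cut:.0f} WRC): {len(norm_flagged)} students flagged")
print(f"Standards-based cut ({standards_cut} WRC): {len(standards_flagged)} students flagged")
```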

  21. Examples of Percentile Rank Norms

  22. Discrepancy Ratio • Quantifies how many times the target student’s current level of performance differs from that of his/her peers. • Compute by dividing the peer median by the target student’s median: 90 ÷ 30 = a discrepancy of 3x. A student this discrepant will need problem solving.
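
A minimal Python sketch of this computation, using the slide’s own 90 and 30 WRC example:

```python
def discrepancy_ratio(peer_median: float, student_median: float) -> float:
    """How many times discrepant the target student is from typical peers."""
    return peer_median / student_median

# Peers read 90 words correctly per minute; the target student reads 30.
print(discrepancy_ratio(90, 30))  # 3.0 -> a 3x discrepancy; problem solving needed
```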

  23. Norm-Based Criteria: 2nd Grade Discrepancy at Tier 1 • At Tier 1, 62% of 2nd grade students have met the expected criterion (55 WRC), compared to 80% nationally.
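
As a hedged illustration of how such a percent-meeting-criterion figure is computed (the scores below are hypothetical, not the slide’s data):

```python
# Hypothetical 2nd grade fall R-CBM scores; the slide's criterion is 55 WRC.
scores = [22, 34, 41, 48, 56, 58, 61, 63, 67, 72, 75, 79, 84, 90, 97, 40]

criterion = 55  # words read correctly
pct_met = 100 * sum(s >= criterion for s in scores) / len(scores)
print(f"{pct_met:.0f}% of students met the {criterion} WRC criterion")
```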

  24. Standards-Based Approaches • Illinois AIMSweb standards tied to the ISAT and the Minnesota state test • Oregon DIBELS standards

  25. General Outcome Measures from Other Fields • Medicine measures height, weight, temperature, and/or blood pressure • The Federal Reserve Board measures the Consumer Price Index • Wall Street measures the Dow Jones Industrial Average • McDonald’s measures how many hamburgers it sells

  26. Understanding General Outcome Measures (GOM), from Mark Shinn, Ph.D. & Michelle Shinn, Ph.D. • Measures important outcomes • General skill rather than individual subskills • Contains a large pool of items • Measurable and observable • Sensitive to growth over relatively short periods of time • Valid and reliable measure

  27. What is Curriculum Based Measurement? • Education has its own set of indicators of general basic skill success (General Outcome Measures). Curriculum-Based Measurement allows us to make important statements about our students’ reading, spelling, written expression, and mathematics computation skills.

  28. AIMSweb • Web-based data management system • Organizes data • “Informs” the teaching and learning process by providing continuous student performance data • Reports improvements to students, parents, teachers, and administrators • Assessment data and interventions are closely linked

  29. AIMSweb CBM Assessments • Oral Reading Fluency (R-CBM): a standardized 1-minute sample of oral reading where the number of words read correctly is counted. (Grades 1-8) • Reading (Maze-CBM): a multiple-choice cloze task that students complete while reading silently. The first sentence of a 150-400 word passage is left intact; thereafter, every 7th word is replaced with three words inside parentheses (see the sketch after this slide). (Grades 1-8) • Phonics and Phonological Awareness (Early Literacy Measures): a standardized sample of fluency in initial sound identification, letter naming, and phonemic segmentation. (Grades K-1) • Math Computation (M-CBM): a standardized 2-4 minute completion of computational problems where the number of correct digits is counted. (Grades 1-8) • **May be used at the high school level to identify at-risk students and for progress monitoring
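
For readers curious about the Maze rule described above, here is a simplified, hypothetical Python generator. Real Maze probes use controlled passages and deliberately chosen distractors; this toy version picks distractors at random and handles sentences naively:

```python
import random

def build_maze(passage: str, distractor_pool: list[str], nth: int = 7) -> str:
    """Toy Maze-CBM generator: keep the first sentence intact, then replace
    every nth word thereafter with a (correct/distractor/distractor) choice."""
    first, _, rest = passage.partition(". ")
    words = rest.split()
    for i in range(nth - 1, len(words), nth):
        choices = [words[i]] + random.sample(distractor_pool, 2)
        random.shuffle(choices)
        words[i] = "(" + "/".join(choices) + ")"
    return first + ". " + " ".join(words)

passage = ("The fox lived at the edge of the forest. Every morning it trotted "
           "down the hill to the river and watched the fish swim past the "
           "rocks before heading home to its den under the old oak tree.")
print(build_maze(passage, ["chair", "loud", "green", "sing"]))
```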

  30. AIMSweb CBM Assessments • Early Numeracy (EN-CBM): a standardized sample of skills in oral counting, identifying missing numbers, number identification, and quantity discrimination. (Grades K-1) • Spelling (S-CBM): a standardized 2-minute spelling word dictation where the number of words spelled correctly or the number of correct letter sequences is counted. (Grades 1-8) • Written Expression (WE-CBM): a standardized 2-4 minutes of writing after being provided a story starter, where the total number of words written or the number of correct word sequences is counted. (Grades 1-8) • MIDE Spanish Early Literacy: a standardized sample of letter naming fluency, letter sound fluency, syllable segmentation, syllable reading fluency, syllable and word spelling, and oral reading fluency. These measures require students to produce information in one minute, with the exception of syllable and word spelling, in which prompts are given every 20 seconds for two minutes.

  31. What Does R-CBM Measure? All of these skills… General Reading Ability

  32. Assessing Reading / Evaluating Core Reading Programs • [Figure: example assessments (DIBELS/ISEL, R-CBM, Running Record, ITBS, IRI, Gates, etc.) mapped to the five components of reading: phonemic awareness, phonics, fluency, vocabulary, comprehension] • http://www.nationalreadingpanel.org/

  33. Reading Comprehension: We Refer to It as General Reading Skills • Metacognition: motivation & engagement; active reading strategies; monitoring strategies; fix-up strategies • Knowledge: life experience; content knowledge; activation of prior knowledge; knowledge about texts • Language: oral language skills; knowledge of language structures; vocabulary; cultural influences • Fluency*: prosody; automaticity/rate; accuracy; decoding; phonemic awareness • *modified slightly from presentations by Joseph Torgesen, Ph.D., Co-Director, Florida Center for Reading Research; www.fcrr.org

  34. Educational Need: Student Scores (Correct Words per Minute) • [Figure: box plot of a grade’s R-CBM scores marking the 10th, 25th, 50th, 75th, and 90th percentiles, with individual scores above the 90th and below the 10th plotted beyond the whiskers] • The box plot draws a box around the range of student scores: here, 43-169.
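
A minimal matplotlib sketch of such a box plot, assuming hypothetical scores and drawing the whiskers at the 10th and 90th percentiles to match the bands named on the slide:

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical winter R-CBM scores for one grade (the slide's range is 43-169).
rng = np.random.default_rng(0)
scores = rng.integers(43, 170, size=60)

# Box spans the 25th-75th percentiles with a line at the median; whiskers are
# drawn at the 10th and 90th percentiles, so points beyond them plot as fliers.
fig, ax = plt.subplots()
ax.boxplot(scores, whis=(10, 90))
ax.set_ylabel("Correct Words per Minute")
ax.set_title("Grade-Level R-CBM Benchmark Scores")
plt.show()
```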

  35. Progress Monitoring: General Education Benchmark Assessment

  36. Schools Use CBM in Universal Screening Instead of Referral-Driven Practices • Below the 25th percentile: Tier 2 candidates • Below the 10th percentile: individual problem solving and/or Tier 3 candidates
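
A small Python sketch of these cut points (hypothetical scores; in practice the percentile cuts may come from national or district norms rather than a single classroom’s distribution):

```python
import numpy as np

def assign_tiers(scores):
    """Map each benchmark score to a tier using the slide's percentile cuts:
    below the 10th percentile -> individual problem solving and/or Tier 3,
    below the 25th -> Tier 2 candidate, otherwise Tier 1 core instruction."""
    p10, p25 = np.percentile(scores, [10, 25])
    return ["Tier 3 / problem solving" if s < p10
            else "Tier 2 candidate" if s < p25
            else "Tier 1 (core)"
            for s in scores]

scores = [18, 27, 41, 52, 55, 58, 62, 66, 71, 78, 85, 93]
for s, tier in zip(scores, assign_tiers(scores)):
    print(s, tier)
```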

  37. Strategic Monitoring of At-Risk Students

  38. Frequent Monitoring toward Individualized Goals

  39. Local Assessments Correlated with Accountability Assessments • Collect a large sample of scores from local assessments (e.g., R-CBM) and correlate them with passing scores on accountability tests (e.g., ISAT) over time. • AIMSweb or a statistician is needed to calculate the correlations. • The correlations determine what minimum score is needed on the local assessment to pass the state accountability measures (a rough sketch of this calculation follows).
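
A rough numpy sketch of the idea (entirely hypothetical paired scores, with a simple linear fit standing in for the statistical work AIMSweb or a statistician would actually do):

```python
import numpy as np

# Hypothetical paired scores: spring R-CBM (WRC) and ISAT reading scale scores.
rcbm = np.array([38, 45, 52, 60, 67, 75, 82, 90, 98, 110, 121, 135])
isat = np.array([148, 152, 158, 163, 170, 176, 181, 189, 196, 205, 214, 228])
isat_passing = 180  # hypothetical "meets standards" scale score

# Strength of the relationship between the local measure and the state test.
r = np.corrcoef(rcbm, isat)[0, 1]
print(f"correlation r = {r:.2f}")

# Fit a line and invert it to estimate the minimum R-CBM score associated
# with a passing ISAT score (a stand-in for the published cut-score methods).
slope, intercept = np.polyfit(rcbm, isat, 1)
local_cut = (isat_passing - intercept) / slope
print(f"estimated local cut score: {local_cut:.0f} WRC")
```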

  40. Advantages of Using CBM for Accountability Assessments • Measures are simple and easy to administer • Measures are reliable and valid • Training is quick • The entire student body can be measured efficiently and frequently • Routine testing allows schools to track progress during the school year

  41. What Assessment Systems Does Your School Use for Each Purpose?

  42. Let’s Review • General assessment principles • Summative vs. formative assessment • Mastery measurement vs. general outcome measures • Problem identification through the referral system vs. universal screening data • Norms- vs. standards-based approaches • Understanding Curriculum-Based Measurement • How to read a box plot • CBMs for program accountability

  43. “It is better to know some of the questions than all of the answers.” – James Thurber

  44. Thank you… • Questions • Comments • For further information, contact: rpoage@sedom.org • Have a Great Thanksgiving!
