Linking Data to Instruction


Presentation Transcript


  1. Linking Data to Instruction Jefferson County School District January 19, 2010

  2. RTI Assessment Considerations • Measurement strategies are chosen to… • Answer specific questions • Make specific decisions • Be given only with a “purpose” in mind • There is a problem if one doesn’t know why an assessment is being given.

  3. Types of Assessments • Screening Assessments - Used with ALL students to identify those who may need additional support (DIBELS, CBM, Office Discipline Referrals for behavior, etc.) • Formative Assessment/Progress Monitoring - Frequent, ongoing assessments that show whether instruction is effective and is impacting student skill development (DIBELS, CBM, etc.) • Diagnostic Assessments - Pinpoint instructional needs for students identified in screenings (Quick Phonics Screener, Survey Level Assessments, Curriculum-Based Evaluation procedures, etc.) ALL PART OF AN ASSESSMENT PROCESS WITHIN RTI!

  4. Universal Screening Assessments • Universal screening occurs for ALL students at least three times per year • Procedures identify which students are proficient (roughly 80%) and which are deficient (roughly 20%) • Good screening measures: • Are reliable, valid, repeatable, brief, and easy to administer • Are not intended to measure everything about a student, but provide an efficient and unbiased way to identify students who will need additional support (Tier 2 or Tier 3) • Help you assess the overall health of your Core program • (Are 80% of your students at benchmark/proficiency?)
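
The “health check” in the last bullet is easy to compute once screening scores are collected. Below is a minimal sketch assuming a single score list and a single benchmark cut point; the scores and the benchmark value are hypothetical, not taken from the presentation.

```python
# Minimal sketch: checking the overall health of a Core program from
# universal screening scores. Scores and benchmark are hypothetical.

def core_health(scores, benchmark):
    """Return the fraction of students at or above the benchmark."""
    return sum(1 for s in scores if s >= benchmark) / len(scores)

# Hypothetical fall ORF screening for one class (words correct per minute).
fall_orf = [92, 41, 105, 77, 60, 88, 130, 54, 79, 101]
proportion = core_health(fall_orf, benchmark=77)

print(f"{proportion:.0%} of students at benchmark")
if proportion < 0.80:
    print("Below the 80% guideline: examine core instruction, "
          "not just individual students.")
```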

  5. Why Use Fluency Measures for Screening? • Oral reading fluency and accuracy in reading connected text are among the best indicators of overall reading comprehension (Fuchs, Fuchs, Hosp, & Jenkins, 2001) • We always examine fluency AND accuracy • Without examining accuracy scores, we are missing a BIG piece of the picture • Students MUST be accurate with any skill before they can be fluent. Oral reading fluency (ORF) does not tell you everything about a student’s reading skill, but a child who cannot read fluently cannot fully comprehend written text and will need additional support.
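
As a concrete illustration of the two numbers an ORF probe yields, here is a minimal sketch using the common CBM scoring convention (words correct per minute, plus an accuracy rate); the counts are hypothetical.

```python
# Sketch of scoring a one-minute ORF probe. Counts are hypothetical.

words_attempted = 84   # words the student read in one minute
errors = 9             # uncorrected errors

wcpm = words_attempted - errors      # fluency: words correct per minute
accuracy = wcpm / words_attempted    # accuracy rate on attempted words

print(f"Fluency:  {wcpm} wcpm")      # 75 wcpm
print(f"Accuracy: {accuracy:.0%}")   # 89% -- below the 95% consensus,
# so this student needs accuracy (decoding) work before fluency building.
```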

  6. Linking Screening Data to Instruction • Questions to consider: • Are 80% of your students proficient based on set criteria (benchmarks, percentiles, standards, etc.)? • If not, what are the common instructional needs? • e.g., fluency, decoding, comprehension, multiplication, fractions, spelling, capitalization, punctuation • What is your plan to meet these common instructional needs schoolwide/grade-wide? • Improved fidelity to core? • More guided practice? • More explicit instruction? • Improved student engagement? • More professional development for staff?

  7. Progress Monitoring Assessments • Help us answer the question: • Is what we’re doing working? • Robust indicator of academic health • Brief and easy to administer • Can be administered frequently • Must have multiple, equivalent forms • (If the metric isn’t the same, the data are meaningless) • Must be sensitive to growth
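
One way to make “Is what we’re doing working?” concrete is to estimate the weekly growth rate from the monitoring scores. The sketch below fits an ordinary least-squares slope to hypothetical weekly wcpm probes; equivalent forms are what keep these points comparable.

```python
# Minimal sketch: weekly growth rate from progress-monitoring scores.
# Probe values are hypothetical wcpm scores from equivalent weekly forms.

def weekly_slope(scores):
    """Ordinary least-squares slope of scores over weeks 0..n-1."""
    n = len(scores)
    mean_w = (n - 1) / 2
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in enumerate(scores))
    den = sum((w - mean_w) ** 2 for w in range(n))
    return num / den

probes = [42, 45, 44, 49, 51, 55, 54, 58]   # eight weekly ORF probes
print(f"Growth: {weekly_slope(probes):.1f} wcpm per week")
# Compare this slope to the aimline (the slope needed to reach the goal by
# the target date) to decide whether the intervention is working.
```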

  8. Screening/Progress Monitoring Tools: Reading • DIBELS PSF, NWF • Pros: Free, quick and easy to administer, good research base, benchmarks, linked to instruction • Cons: Only useful in Grades K-2 • ORF (DIBELS, AIMSweb, etc.) • Pros: Free, good reliability and validity, easy to administer and score • Cons: May not fully account for comprehension in a few students • MAZE • Pros: Quick to administer, may address comprehension more than ORF, can be administered to large groups simultaneously, useful in secondary • Cons: Time-consuming to score, not as sensitive to growth as ORF • OAKS • Pros: Already available, compares to state standards • Cons: Just passing isn’t good enough, not linked directly to instruction, needs to be used in conjunction with other measures

  9. Screening/Progress Monitoring Tools: Math • CBM Early Numeracy Measures • Pros: Good reliability and validity, brief and easy to administer • Cons: Sensitivity to growth, only useful in K-2 • Math Fact Fluency • Pros: Highly predictive of struggling students • Cons: No benchmarks, only a small piece of math screening • CBM Computation • Pros: Quick and easy to administer, sensitive to growth, face validity • Cons: Predictive validity questionable, not linked to current standards • CBM Concepts and Applications • Pros: Quick and easy to administer, good predictive validity, linked to NCTM Focal Points (AIMSweb) • Cons: Not highly sensitive to growth, newer measures • easyCBM • Pros: Based on NCTM Focal Points, computer-based administration and scoring • Cons: Untimed (does not account for fluency), lengthy (administer no more than once every 3 weeks), predictive validity uncertain

  10. Screening/Progress Monitoring Tools: Writing • CBM Writing • Pros: Easy to administer to large groups, can obtain multiple scores from a single probe • Cons: Time-consuming to score, does not directly measure the content of writing • Correct Writing Sequences (CWS, %CWS) • Pros: Good reliability and validity, sensitive to growth at some grade levels • Cons: Time-consuming to score, not as sensitive to growth in upper grades, %CWS not sensitive to growth • Correct Minus Incorrect Writing Sequences (CIWS) • Pros: Good reliability and validity, sensitive to growth in upper grades • Cons: Time-consuming to score, not sensitive to growth in lower grades
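
Scoring which adjacent word pairs count as correct writing sequences takes human judgment, but the three summary metrics named above follow mechanically from the hand-scored counts. A minimal sketch, with hypothetical counts:

```python
# Sketch: deriving CWS, %CWS, and CIWS from hand-scored sequence counts.
# The counts below are hypothetical.

def writing_scores(cws, iws):
    total = cws + iws                      # all writing sequences scored
    return cws, cws / total, cws - iws     # CWS, %CWS, CIWS

cws, pct_cws, ciws = writing_scores(cws=38, iws=7)
print(f"CWS={cws}, %CWS={pct_cws:.0%}, CIWS={ciws}")  # CWS=38, %CWS=84%, CIWS=31
```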

  11. Screening & Progress Monitoring Resources • National Center on Response to Intervention (www.rti4success.org) • Intervention Central (www.interventioncentral.com) • AIMSweb (www.aimsweb.com) • DIBELS (https://dibels.uoregon.edu) • easyCBM (www.easycbm.com) • The ABCs of CBM (Hosp, Hosp, & Howell, 2007)

  12. Diagnostic Assessments • The major purpose of administering diagnostic tests is to provide information that is useful in planning more effective instruction. • Diagnostic tests should only be given when there is a clear expectation that they will provide new information about a child’s difficulties learning to read that can be used to provide more focused or more powerful instruction.

  13. Diagnostic Assessment Questions • “Why is the student not performing at the expected level?” • “What is the student’s instructional need?” • Start by reviewing existing data.

  14. Diagnostic Assessments • Quick Phonics Screener (Hasbrouck) • DRA • Error Analysis • Survey Level Assessments • In-Program Assessments (mastery tests, checkouts, etc.) • Curriculum-Based Evaluation procedures • “any set of measurement procedures that use direct observation and recording of a student’s performance in a local curriculum as a basis for gathering information to make instructional decisions” (Deno, 1987) • Any informal or formal assessments that answer the question: Why is the student having problems?

  15. The Problem Solving Model • Define the Problem: • What is the problem and why is it happening? • Design Intervention: • What are we going to do about the problem? • Implement and Monitor: • Are we doing what we intended to do? • Evaluate Effectiveness: • Did our plan work?

  16. Using the Data to Inform Interventions • What is the student missing? • What do the data tell you? • Start with what you already have, and ask “Do I need more info?”

  17. Using your data to create interventions: An Example Adapted from

  18. Organizing Fluency Screening Data: Making the Instructional Match Regardless of the skill focus, organizing student data by looking at accuracy and fluency will assist teachers in making an appropriate instructional match!

  19. Digging Deeper with Screening Data • Is the student accurate? • Must define the accuracy expectation • Consensus in reading research is 95% • Is the student fluent? • Must define the fluency expectation • Fluency Measuring Tools: • Curriculum-Based Measures (CBM) • AIMSweb (grades 1-8) • Fuchs reading probes (grades 1-7) • DIBELS (grades K-6)

  20. Organizing Fluency Data: Making the Instructional Match Group 1: Dig deeper in the areas of reading comprehension, including vocabulary and specific comprehension strategies. Group 2: Build reading fluency skills (Repeated Reading, Paired Reading, etc.). Embed comprehension checks/strategies. Group 3: Conduct an error analysis to determine the instructional need. Teach to the instructional need paired with fluency-building strategies. Embed comprehension checks/strategies. Group 4: Conduct the Table-Tap Method. If the student can correct errors easily, teach the student to self-monitor reading accuracy. If the reader cannot self-correct errors, complete an error analysis to determine the instructional need. Teach to the instructional need. (Quadrant labels from the slide’s matrix: Core Instruction + Check Comprehension • + Fluency Building • + Decoding, then Fluency • Self-Monitoring)

  21. Data Summary: 3rd-Grade Class, Fall DIBELS (ORF benchmark ≥ 77 wcpm)

  22. Day 4’s Activity (accuracy and wcpm) • ACTIVITY: Based on the criteria for the grade level, place each student’s name into the appropriate box. • Organizing data based on performance assists in grouping students for instructional purposes. • Students who do not perform well on comprehension tests have a variety of instructional needs.

  23. Match the Student to the Appropriate Box: >95% accuracy and 77 wcpm • Jerry • Jim • Mary • Nancy • Ted
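
The sorting in this activity is mechanical once accuracy and fluency are in hand. The sketch below applies the cut points above to the five students; their scores are hypothetical (the slide gives only names), and the group labels follow slide 20.

```python
# Sketch of the matching activity: sort students into the four boxes from
# slide 20 using accuracy > 95% and fluency >= 77 wcpm. The scores are
# hypothetical -- the slide supplies only the names.

def instructional_match(accuracy, wcpm, acc_cut=0.95, flu_cut=77):
    accurate, fluent = accuracy > acc_cut, wcpm >= flu_cut
    if accurate and fluent:
        return "Group 1: check comprehension; dig deeper there if needed"
    if accurate:                       # accurate but slow
        return "Group 2: fluency building (repeated/paired reading)"
    if not fluent:                     # inaccurate and slow
        return "Group 3: error analysis; teach decoding, then fluency"
    return "Group 4: fluent but inaccurate; teach self-monitoring"

students = {"Jerry": (0.98, 92), "Jim": (0.97, 58), "Mary": (0.88, 49),
            "Nancy": (0.91, 83), "Ted": (0.99, 110)}   # (accuracy, wcpm)
for name, (acc, wcpm) in students.items():
    print(f"{name:5s} -> {instructional_match(acc, wcpm)}")
```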

  24. Regardless of Skill… ACCURACY Matters! • Phonemic Awareness • Letter Naming • Letter Sounds • Beginning Decoding Skills • Sight Words • Addition • Subtraction • Fractions

  25. Instructional “Focus” Continuum

  26. Digging Deeper • In order to be “diagnostic” • Teachers need to know the sequence of skill development • Content knowledge may need further development • How deep depends on the intensity of the problem.

  27. Phonemic Awareness Developmental Continuum (Vital for the Diagnostic Process!) Ordered from hardest to easiest: • Phoneme deletion and manipulation • Blending and segmenting individual phonemes • Onset-rime blending and segmentation • Syllable segmentation and blending • Sentence segmentation • Rhyming • Word comparison If difficulty is detected at one level, check the next easier level.
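
The “if difficulty detected here, then check here” arrow on the original slide encodes a simple rule: when a student struggles at one level of the continuum, probe the next easier level. A minimal sketch of that rule:

```python
# Sketch: walking down the phonemic awareness continuum one level when a
# student shows difficulty. The skill names come from the slide above.

PA_CONTINUUM = [                       # ordered easiest -> hardest
    "word comparison",
    "rhyming",
    "sentence segmentation",
    "syllable segmentation and blending",
    "onset-rime blending and segmentation",
    "blending and segmenting individual phonemes",
    "phoneme deletion and manipulation",
]

def next_skill_to_probe(failed_skill):
    """Return the next easier skill to assess, or None at the floor."""
    i = PA_CONTINUUM.index(failed_skill)
    return PA_CONTINUUM[i - 1] if i > 0 else None

print(next_skill_to_probe("blending and segmenting individual phonemes"))
# -> onset-rime blending and segmentation
```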

  28. Screening Assessments: Not Always Enough • Screening assessments do not always go far enough in answering the question: Why is the student having problems? • We will need to “DIG DEEPER” with tools such as: • Quick Phonics Screener • Error Analysis • Curriculum-Based Evaluation

  29. When does this happen? • Tier 1 Meetings

  30. When does this happen? • Tier 2 Meetings

  31. When does this happen? • Tier 3 (Individual Problem Solving) Meetings

  32. Useful Resources • What Works Clearinghouse • http://ies.ed.gov/ncee/wwc/ • Florida Center for Reading Research • http://www.fcrr.org/ • National Center on Response to Intervention • http://www.rti4success.org/ • Center on Instruction • http://www.centeroninstruction.org/ • Oregon RTI Project • http://www.oregonrti.org/ • Curriculum-Based Evaluation: Teaching and Decision Making (Howell & Nolet, 2000) • The ABCs of CBM (Hosp, Hosp, & Howell, 2007)
