
Interpreting and Using DIBELS Next Data


Presentation Transcript


  1. Interpreting and Using DIBELS Next Data Presented by April Kelley

  2. Introductions – That’s Me • I’ve used DIBELS for • 1-3 years • 3-5 years • 5+ years • I know how to give the assessment • I know how to read the reports • I know how to use the results to change my instruction

  3. Seasonal Partners

  4. Essential Questions • What questions do we have about giving the assessment? • How do we use DIBELS with an outcomes-driven model? • Identify Need • Validate Support • Plan Support • Evaluate Support • Evaluate Outcomes

  5. What do you like about mCLASS?

  6. What logistical questions do you have about administering DIBELS Next?

  7. DIBELS Next & The Big 5 Puzzle Activity

  8. General Reporting Features Video

  9. Explore Scavenger Hunt • Ask April if you get stumped

  10. How do we use DIBELS with an outcomes-driven model? • Identify the Need for Support (Benchmark Assessment) • Validate the Need for Support • Plan Support • Implement Support • Evaluate Effectiveness of Support (Progress Monitoring) • Review Outcomes (Benchmark Assessment)

  11. Outcomes-Driven Model

  12. Reports (to help identify student needs)… • Your Students’ Instructional Recommendations • DIBELS Next Measure Breakdown • Growth this Year • Finding Students in Need • Seeing Achievement Gaps

  13. Processing Time • Look at some of these reports and take 10 minutes to analyze your data • What do we know from the data? • Which students may need additional support?

  14. Outcomes-Driven Model

  15. Validate the Need for Support Are we reasonably confident the student needs instructional support? • Rule out other explanations for poor performance, such as a bad day, confusion about the directions, illness, shyness, etc. What data can you use? • Repeat assessments using progress monitoring probes • At least 2 more times, not on the same day, but within 1 week
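
The repeat-probe check on this slide can be written down as a simple rule: gather at least two more scores on different days within the week, then compare the student's typical score to the benchmark goal. A minimal sketch in Python, assuming hypothetical scores and a hypothetical goal; mCLASS does not expose this as an API, and using the median is a common CBM convention rather than something the slide specifies:

```python
# Sketch of "validate the need for support": confirm a low benchmark score
# with at least 2 repeat progress monitoring probes (different days, within
# one week). All scores and the goal here are hypothetical.
from statistics import median

def needs_support(benchmark_score: int, repeat_scores: list[int], goal: int) -> bool:
    """True if repeat probes confirm the student is below the benchmark goal."""
    if len(repeat_scores) < 2:
        raise ValueError("Repeat the assessment at least 2 more times.")
    # The median keeps one bad day or a confused administration from deciding.
    typical = median([benchmark_score, *repeat_scores])
    return typical < goal

print(needs_support(benchmark_score=15, repeat_scores=[18, 16], goal=23))  # True
```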

  16. Outcomes-Driven Model

  17. Big Ideas & Instructional Goals Instructional goals should be guided by the 5 Big Ideas: • Phonemic Awareness • Alphabetic Principle (Phonics) • Accuracy and Fluency with Connected Text (Fluency) • Vocabulary • Reading Comprehension

  18. Considerations in Planning Instruction • Whole Group Instruction • Are my students on track? • What do I need to target for my instruction?

  19. Try it out… Kindergarten Classroom • Spring • 50% at benchmark on PSF and NWF • Goal??? (winter partner)

  20. Try it out… First Grade Classroom • 80% at benchmark on ORF • 40% on NWF • Goal??? (winter partner)

  21. Try it out… Second Grade Classroom • Fall • 90% benchmark on NWF • 40% benchmark on ORF • Goal??? (spring partner)

  22. Try it out… Fourth Grade Classroom • 65% benchmark on Composite Score • 90% benchmark on DORF Fluency • 60% benchmark on DORF Accuracy • 55% benchmark on DAZE • Goal??? (spring partner)

  23. Try it out… Sixth Grade Classroom • 80% benchmark on Composite Score • 60% benchmark on DORF Fluency • 95% benchmark on DORF Accuracy • 90% benchmark on DAZE • Goal??? (spring partner)
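
Each "Try it out" scenario above boils down to the same arithmetic: what percent of the class is at benchmark on each measure, and which measures fall short enough that whole-group instruction should target them. A minimal sketch, assuming hypothetical scores; the 80% "healthy core" threshold is a common MTSS convention, not a number taken from these slides:

```python
# Hypothetical helpers for the scenarios above: percent of a class at
# benchmark on a measure, and which measures need a whole-group focus.
# The 80% "healthy core" line is a common MTSS convention, not from the slides.
def percent_at_benchmark(scores: list[int], cut_point: int) -> float:
    return 100 * sum(s >= cut_point for s in scores) / len(scores)

def core_focus(measure_pcts: dict[str, float], healthy_core: float = 80.0) -> list[str]:
    """Measures where too few students are at benchmark for core instruction."""
    return [m for m, pct in measure_pcts.items() if pct < healthy_core]

print(percent_at_benchmark([30, 12, 41, 26, 18], cut_point=26))  # 60.0
print(core_focus({"NWF": 90.0, "ORF": 40.0}))                    # ['ORF']
```

For the second-grade fall scenario (90% at benchmark on NWF, 40% on ORF), the helper flags ORF, matching the intuition that the whole-group goal should center on reading connected text.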

  24. Reports (to help plan whole group instruction)… • DIBELS Next Measure Breakdown

  25. Considerations in Planning Instruction • Small Group Instruction • Which students have similar skill strengths and weaknesses? • How can I group students for the instruction they need?

  26. How will students be grouped for instruction? • Students with same composite score or overall instructional recommendation DO NOT necessarily have the same instructional needs. • Students who have scores within the same range on a measure DO NOT necessarily have the same instructional needs.

  27. Grouping Students • Analyze student performance across all measures • Group students with similar instructional needs • It’s important to consider how each DIBELS measure relates to the Big Ideas of reading instruction and to the other measures

  28. Considerations for Groupings • You MUST look at the scoring protocol – a number is NOT enough information for grouping purposes • Ask yourself • Is the student accurate but slow? • How accurate? • Are there any error patterns? • Is a problem fluency-based? • Is the student making multiple errors and performing at a slow pace?
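
Slide 28's questions translate into a rough grouping heuristic: classify each student by accuracy and rate, then group students who land in the same profile. A sketch under stated assumptions; the 95% accuracy line and the profile labels are illustrative, and, per the slide, a number alone is never enough, so the scoring protocol should confirm every placement:

```python
# Slide 28's questions as a grouping heuristic. The 95% accuracy line and
# the profile labels are illustrative assumptions; confirm every placement
# against the scoring protocol, since a number alone is not enough.
def profile(accuracy_pct: float, met_rate_goal: bool) -> str:
    if accuracy_pct >= 95 and met_rate_goal:
        return "on track"
    if accuracy_pct >= 95:
        return "accurate but slow -> build fluency"
    if met_rate_goal:
        return "fast but inaccurate -> study error patterns"
    return "multiple errors at a slow pace -> target the underlying skill"

groups: dict[str, list[str]] = {}
for name, acc, met in [("Ana", 98.0, False), ("Ben", 88.0, False), ("Cy", 97.0, True)]:
    groups.setdefault(profile(acc, met), []).append(name)
print(groups)  # students with the same profile land in the same group
```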

  29. Considerations for Groupings • Are additional diagnostic assessments, placement tests, and/or work samples needed? • What student factors do I need to consider? (behavioral needs, attendance, etc.) • What personnel resources do I have, and what does my schedule/time allotment for instruction look like?

  30. Sample Grouping Worksheets • esu6-readingnews.wikispaces.com/

  31. Reports (to help plan for small group instruction)… • DIBELS Next Measure Breakdown • Finding Students in Need • Seeing Achievement Gaps

  32. Considerations in Planning Instruction • Individual Instruction • Which students might be at risk for reading failure without intensive interventions? • How do we prioritize the skills they need?

  33. Individual Student Problem-Solving Agenda • April’s Sample Agenda, Steps #1-4

  34. Digging Deeper • Look at student error patterns for additional direction. • Example 1: Are students not reaching benchmark on NWF because they don’t know letter-sound correspondences or because they are not blending sounds together? • Example 2: Are students not reaching benchmark on ORF because they are accurate but not fluent, or because they are making many errors (low accuracy)?

  35. What skills should we teach? (summer partner) Scenario Review • What if a student is low on First Sound Fluency and Phoneme Segmentation Fluency? • Target phonemic awareness • What if a student is low on Nonsense Word Fluency? • If NWF accuracy is below 97%, target accuracy w/ beginning phonics • If NWF accuracy is at/above 97% but recoding is low, target blending with phonemic awareness blending skills • If NWF accuracy is at/above 97%, target building automaticity (fluency)
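
The Nonsense Word Fluency branch on this slide is already a decision tree, so it can be captured directly in code. A minimal sketch; the 97% cut and the recoding check come from the slide and should be read as the presenter's rules of thumb, not official DIBELS Next guidance:

```python
# The NWF branch from this slide as a decision function. The 97% cut and
# the low-recoding check are the presenter's rules of thumb, not official
# DIBELS Next guidance.
def nwf_focus(accuracy_pct: float, low_recoding: bool) -> str:
    if accuracy_pct < 97:
        return "target accuracy with beginning phonics"
    if low_recoding:
        return "target blending with phonemic awareness blending skills"
    return "target building automaticity (fluency)"

print(nwf_focus(94.0, low_recoding=False))  # beginning phonics
print(nwf_focus(98.0, low_recoding=True))   # blending
```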

  36. What skills should we teach? (fall partner) Scenario Review • What if a student is low on Oral Reading Fluency? • Target fluency with connected text if accuracy is greater than 95% • Target alphabetic principle if accuracy is less than 95% • Target comprehension and/or vocabulary if the student is making meaning-distortion errors • What if a student is low on ORF + DAZE? • Teach fluency & comprehension
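
The Oral Reading Fluency branch works the same way. Same caveat as the NWF sketch: the 95% accuracy line, the meaning-distortion check, and the ORF + DAZE rule are taken from the slide text:

```python
# The ORF branch from this slide; the 95% line, the meaning-distortion
# check, and the ORF + DAZE rule come straight from the slide text.
def orf_focus(accuracy_pct: float, meaning_distortions: bool, low_daze: bool) -> list[str]:
    targets = ["fluency with connected text" if accuracy_pct > 95
               else "alphabetic principle"]
    if meaning_distortions or low_daze:
        targets.append("comprehension and/or vocabulary")
    return targets

print(orf_focus(92.0, meaning_distortions=False, low_daze=False))
print(orf_focus(97.0, meaning_distortions=True, low_daze=True))
```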

  37. Reports (to help plan individual student instruction)… • Finding Students in Need

  38. Application Time • Using one of the suggested reports, look for students who have similar skill needs. • See if you can identify students who may belong in the same groups. • Determine what they need for skill instruction.

  39. Outcomes-Driven Model

  40. How will we know if the interventions are effective for individual students?

  41. Progress Monitoring Decision Rules • Which students will we monitor? • How often will they be monitored? • Who will monitor them? • What will we do with the data?
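
One common way to answer the last question, "what will we do with the data," is an aimline comparison with a three-point decision rule. The rule below is a widely used CBM convention, not something these slides prescribe, and the scores and aimline values are hypothetical:

```python
# A common aimline comparison with a three-point decision rule. This is a
# widely used CBM convention, not a rule these slides prescribe; scores and
# aimline values are hypothetical words-correct-per-minute numbers.
def decision(last_three_scores: list[int], aimline_values: list[int]) -> str:
    below = [s < a for s, a in zip(last_three_scores, aimline_values)]
    above = [s > a for s, a in zip(last_three_scores, aimline_values)]
    if all(below):
        return "change or intensify the intervention"
    if all(above):
        return "consider raising the goal"
    return "continue the intervention and keep monitoring"

print(decision([20, 22, 21], [24, 25, 26]))  # change or intensify
```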

  42. Sample Graph #1 • What do we need to do with this student?

  43. Sample Graph #2 • What do we need to do with this student?

  44. Sample Graph #3 • What do we need to do with this student?

  45. Sample Graph #4 • What do we need to do with this student?

  46. Individual Student Problem-Solving Agenda • April’s Sample Agenda, Step #5

  47. Reports (to help evaluate individual student support effectiveness)… • DIBELS Effectiveness Formula • Your Students’ Instructional Recommendations

  48. Outcomes-Driven Model
