
Data Based Decision Making


Presentation Transcript


  1. Data Based Decision Making

  2. Reading Review Stanovich, 2010 Fuchs & Fuchs -- Progress Monitoring

  3. "It ain't so much the things we don't know that get us into trouble. It's the things we know that just ain't so." -Josh Billings Perhaps the second most famous humor writer and lecturer in the United States in the second half of the 19th century after Mark Twain

  4. We Never Know for Sure… • Even practices with the best research base… may not work for some students. • So… if you are using a research-based intervention – implement & COLLECT DATA! • And… if you are struggling to identify a research-based intervention – implement & COLLECT DATA!

  5. Critical Concept: Data Based Decision Making Continuous, purposeful process of collecting, interpreting, presenting and using data to inform actions that support positive educational outcomes. Data based decision making considers the learner’s progress within the contexts of instruction, curriculum and environment.

  6. Necessary Components of Assessment • When a student is experiencing difficulty, several related & complementary types of assessment should be performed • Assessment of the Learner (Student) • Assessment of Instruction (or Intervention) • Assessment of the Curriculum and Environment [Diagram: Learner, Instruction/Intervention, Curriculum, Environment]

  7. Measuring ICE: Instruction, Curriculum, Environment • What questions might you have about the instruction/intervention or curriculum? • Are the instruction/intervention methods research-based? • Implementation fidelity? • Is the classroom environment suitable for learning? • Time on task • Instructional time • Academic engaged time • Opportunities to Respond & % Correct Responses • Positive to Negative Ratio • Student problem behavior
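The environment metrics at the end of this list are simple counts and rates, so they can be summarized directly from an observer's tallies. A minimal sketch; the function name, observation length, and example tallies are hypothetical, not from the presentation:

```python
# Hypothetical sketch: turning classroom-observation tallies into the
# environment metrics listed above. Example numbers are illustrative only.

def summarize_observation(minutes_observed, opportunities_to_respond,
                          correct_responses, positives, negatives):
    """Return simple rates for a single classroom observation."""
    otr_rate = opportunities_to_respond / minutes_observed        # OTRs per minute
    pct_correct = 100 * correct_responses / opportunities_to_respond
    pos_neg_ratio = positives / negatives if negatives else float("inf")
    return {
        "OTR per minute": round(otr_rate, 1),
        "% correct responses": round(pct_correct, 1),
        "positive:negative ratio": round(pos_neg_ratio, 1),
    }

# Example: a 20-minute observation with 60 opportunities to respond
print(summarize_observation(20, 60, 48, 25, 5))
# {'OTR per minute': 3.0, '% correct responses': 80.0, 'positive:negative ratio': 5.0}
```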

  8. Models for Data Based Decision Making Problem Solving Models & Outcomes Driven Models

  9. [PBIS elements diagram] • OUTCOMES: Supporting Social Competence & Academic Achievement • DATA: Supporting Decision Making • SYSTEMS: Supporting Staff Behavior • PRACTICES: Supporting Student Behavior

  10. Outcomes Driven Model • In an Outcomes Driven Model, the bottom line is achievement of essential educational or social outcomes • What are the desired outcomes? • Are students attaining the necessary skills to be successful? • If not, what changes can we make? • Are the changes increasing student progress?

  11. Research Based Frameworks Needed • How do we know what to measure & when? • Reading • RTI & Big 5 Ideas of Reading • Math • RTI • Behavior • PBIS, Function of Behavior & ABA

  12. Big 5 Ideas of Reading • Reading Comprehension • Vocabulary • Oral Reading Fluency & Accuracy • Phonics (Alphabetic Principle) • Phonemic Awareness

  13. 3. Accurately identify those who are on track and those who will need more support We must identify struggling students BEFORE they fall too far behind (Good, Simmons, & Smith, 1998)

  14. Response to Intervention [RTI triangle, circa 1996: Academic Systems & Behavioral Systems] • Intensive, Individual Interventions (1-5% of students) • Individual students • Assessment-based • High intensity; intense, durable procedures • Targeted Group Interventions (5-10%) • Some students (at-risk) • High efficiency • Rapid response • Universal Interventions (80-90%) • All settings, all students • Preventive, proactive

  15. Team Initiated Problem Solving (TIPS) Model • Identify Problems • Develop Hypothesis • Discuss and Select Solutions • Develop and Implement Action Plan • Evaluate and Revise Action Plan • Collect and Use Data (throughout the cycle) • Problem Solving Meeting Foundations (the base of the model)

  16. Purposes of Assessment • Screening • “Which students need more support?” • Progress Monitoring • “Is the student making adequate progress?” • Diagnostic • “What and how do we need to teach this student?” • Outcome • “Has our instruction been successful?”

  17. Outcomes Driven Model [Cycle diagram: Screening, Diagnostic, Progress Monitoring, Outcome]

  18. Effective Data Collection

  19. Use the right tools for the right job • Screening • Progress Monitoring • Diagnostic Assessment • Outcomes

  20. Use Good Tools: Technically Adequate • Reliability = Consistency • The extent to which an assessment is consistent in finding the same results across conditions (across different administrators, across time, etc.) • If the same measure is given several times to the same person, the person's scores should remain stable & not randomly fluctuate
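One common way to put a number on the consistency described here is a test-retest correlation: give the same measure twice and check that scores line up. A minimal sketch, using invented scores rather than data from the presentation:

```python
# Minimal test-retest reliability sketch: the same six students take the same
# measure twice; reliable scores should correlate strongly rather than
# fluctuate randomly. Scores are invented for illustration.
from statistics import correlation  # Pearson r; available in Python 3.10+

first_administration  = [42, 55, 61, 38, 70, 49]
second_administration = [45, 53, 63, 40, 68, 50]

r = correlation(first_administration, second_administration)
print(f"Test-retest reliability (Pearson r): {r:.2f}")  # near 1.0 = consistent scores
```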

  21. Use Good Tools: Technically Adequate • Validity = the extent to which an assessment measures what it is supposed to measure • First we need to know what we should be measuring! • Research-Based Frameworks for Measurement • Students who do well on valid reading tests are proficient readers • Valid = assessing reading by having the student read a passage aloud and monitoring errors and rate • Not Valid = assessing reading by having a student match printed letters on a page (e.g., "Draw a line to match the letters: A f U p w w E A f I U v B p"); this is an assessment of matching visual figures

  22. Use Good Tools: A Concern for Self-Developed Assessments • Technical adequacy can be a problem with self-developed measures • This is a challenge for the Professional Learning Team model, which often relies on teacher-developed assessments to measure important student outcomes & guide decision making

  23. Low Inference • Students are tested using materials that are directly related to important instructional outcomes • Low inference • Making judgments about a child's reading skills based on listening to the child read aloud • High inference • Making judgments about a child's emotional state based on pictures the child has drawn

  24. Use the tools correctly: Standardized Administration • Administered, scored, and interpreted in the same way • Directions given to students are consistent • Student responses are scored in the same way • Every student has the exact same opportunity on the assessment

  25. Efficiency • Time is precious in classrooms, so efficiency is an important consideration • When evaluating the efficiency of an assessment tool, we must consider: • Time & personnel required to design, administer, and score the assessment

  26. Screening

  27. 1. Compare ALL students to the same grade-level standard ALL students are assessed against the grade-level standard, regardless of instructional level "If you don't know where you are going, you will wind up somewhere else." ~ Yogi Berra

  28. 2. Be efficient, standardized, reliable, and valid • Robust indicator of academic health • Brief and easy to administer • Can be administered frequently • Must have multiple, equivalent forms (If the metric isn’t the same, the data are meaningless) • Must be sensitive to growth

  29. 3. Accurately identify those who are on track and those who will need more support We must identify struggling students BEFORE they fall too far behind (Good, Simmons, & Smith, 1998)

  30. 4. Evaluate the quality of your schoolwide instructional system • Are 80% of your students proficient? • Are 80% of students reaching benchmarks and "on track" for the next goal? If not, then the core curriculum needs to be addressed
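A quick arithmetic check of this 80% guideline might look like the sketch below; the benchmark value and the student scores are hypothetical, not from the presentation.

```python
# Sketch of the schoolwide check above: what percentage of students met
# benchmark on the screener? Benchmark and scores are hypothetical.

def percent_at_benchmark(scores, benchmark):
    at_or_above = sum(1 for s in scores if s >= benchmark)
    return 100 * at_or_above / len(scores)

winter_orf = [72, 88, 95, 40, 110, 67, 81, 54, 99, 76]   # words correct per minute
pct = percent_at_benchmark(winter_orf, benchmark=68)

print(f"{pct:.0f}% of students at or above benchmark")
if pct < 80:
    print("Below 80%: examine the core (Tier 1) curriculum and instruction.")
```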

  31. What are Screening Tools?
  Screening Tools: • DIBELS • Oral Reading Fluency • Maze • EasyCBM • CBM Math Computation • CBM Writing – Story Starters • CBM Algebra • CBM Early Numeracy
  Not Screening Tools: • Quick Phonics Screener • QRI-IV • DRA2 • Running Records • Report cards • Meeting OAKS standards • Core curriculum weekly tests on skills that are learned

  32. One Page of a 3-Page CBM in Math Concepts and Applications (24 Total Blanks)

  33. Previous Year's Discipline Data Who needs to be on our radar from Day 1? Who had FBAs/BSPs last year? Which students moved on? Which are returning this year? Can we get data for our incoming class & new students? Decision Rule

  34. Progress Monitoring

  35. Progress Monitoring Tools • Brief & easy • Sensitive to growth • Equivalent forms • Frequent

  36. [Graph: "We are Here" to "Our Goal", showing Desired Course vs. Actual Course] • Where are we? • What is our goal? • What course should we follow? • How are we doing?

  37. Progress Monitoring: The GPS for Educators!

  38. Purpose of Progress Monitoring Answers the question(s): Are the children learning? How can we tell? Are they making enough progress? Can we remove some of our supports? Do we need to change or intensify our supports?

  39. How often do you progress monitor students? Determined by district decision rules and level of need • Best practice recommendations: • Intensive: 1-2x per week • Strategic: 1-2x per month

  40. How do we know if a student is making adequate progress? • Decision Rules • [Graph: Correct Words per Minute]

  41. Questions to Consider • How many data points below the line before you make a change in instruction/intervention? • What do you change? • Group size? • Time? • Curriculum? • Other factors?

  42. Progress Monitoring: Phonics for Reading [Graph of weekly scores: 30, 34, 35, 38, 27, 25, 31, 32]
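One common convention from the CBM progress-monitoring literature (not a rule stated in these slides) is to draw an aimline from the baseline score to the goal and to change the intervention when several consecutive data points, often four, fall below that line. The sketch below applies that convention to the weekly scores on the slide above; the goal of 62 words correct per minute and the 16-week timeline are hypothetical.

```python
# Hedged sketch of a common CBM decision rule: change the intervention when
# several consecutive scores fall below the aimline drawn from baseline to goal.
# The goal (62 wcpm) and 16-week timeline are assumptions; the weekly scores
# are the ones shown on slide 42.

def aimline(baseline, goal, weeks):
    """Expected score for each week, on a straight line from baseline to goal."""
    growth_per_week = (goal - baseline) / weeks
    return [baseline + growth_per_week * w for w in range(weeks + 1)]

def needs_change(scores, expected, consecutive=4):
    """True if `consecutive` data points in a row fall below the aimline."""
    run = 0
    for actual, target in zip(scores, expected):
        run = run + 1 if actual < target else 0
        if run >= consecutive:
            return True
    return False

expected = aimline(baseline=30, goal=62, weeks=16)
weekly_scores = [30, 34, 35, 38, 27, 25, 31, 32]

if needs_change(weekly_scores, expected):
    print("Change the intervention (group size, time, curriculum, other factors).")
else:
    print("Stay the course and keep monitoring.")
```

With these assumed numbers the last four scores all fall below the aimline, so the rule flags a change, which is exactly the kind of question slide 41 asks teams to decide in advance.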

  43. We do not use progress monitoring data to… …select specific short-term instructional goals …take a lot of time away from instruction …diagnose educational problems …assign grades to students …evaluate teachers

  44. What are Progress Monitoring Tools?
  Progress Monitoring Tools: • DIBELS • Oral Reading Fluency • Maze • EasyCBM • CBM Math Computation • CBM Writing – Story Starters • CBM Algebra • CBM Early Numeracy
  Not Progress Monitoring Tools: • Quick Phonics Screener • QRI-IV • DRA2 • Running Records • Report cards • Meeting OAKS standards • Core curriculum weekly tests on skills that are learned

  45. Progress Monitoring data tell us WHEN a change is needed. Progress Monitoring data do not always tell us WHAT change is needed.

  46. Point Card

  47. Look at the Individual Student graph for targeted student(s)

  48. Diagnostic Assessment Answers the question… Why? WARNING! Critical Thinking Skills may be Required

  49. Collecting Diagnostic Data • The major purpose for administering diagnostic tests is to provide information that is useful in planning more effective instruction. • Diagnostic tests should only be given when there is a clear expectation that they will provide new information about a child's difficulties learning to read that can be used to provide more focused, or more powerful, instruction.

  50. Diagnostic Assessment Questions “Why is the student not performing at the expected level?” (Defining the Problem) “What is the student’s instructional need?” (Designing an Intervention)
