
Data-Based Decision Making


Presentation Transcript


  1. Data-Based Decision Making

  2. Openings & Introductions • Session-at-a-glance • Introductions • Training Norms • Learner Objectives • Pre-Session Readings and Essential Questions

  3. Session-At-A-Glance • Overview of data-based decision making (DBDM) (30 minutes) • 6 steps of the DBDM process (4 hours) • Why? • What? • Implementation fidelity indicators for each step • Team action planning for each step • Action planning for the entire DBDM process (30 minutes)

  4. Introductions • Consultant: add information needed regarding introductions for your session • Select and use an inclusion activity at consultant discretion

  5. Training Norms • Begin and end on time • Be an engaged participant • Be an active listener—open to new ideas • Use notes for side bar conversations • Use electronics respectfully

  6. Learner Outcomes • Teacher learns how data-based decision making allows for demonstration of Missouri Teacher Standards • Teacher develops knowledge and applies steps of DBDM “Cycles” (Data Teams) with example data sets: • Develop classroom system of data collection and charting • Analyze and disaggregate student learning • Establish student goals based on results • Select instructional practices • Determine results indicators for process (cause) and product (effect / social emotional and behavioral) • Design ongoing monitoring of results (monitor, reflect, adjust, repeat) • Review results indicators • Review implementation: • Instructional practices • Data cycle

  7. Learner “Post” Objectives 3. Teacher utilizes steps of DBDM “Cycles” with their classroom data • Teacher will collect, chart, analyze and disaggregate student learning data as well as implementation data • Teacher will explain results indicators for process (cause) and product (effect) • Teacher will design ongoing monitoring of results (monitor, reflect, adjust, repeat)

  8. Preparatory Reading Reflection “C” Using Student Data to Support Instructional Decision Making • Review the 5 recommendations in the IES Practice Guide Summary • Mark with a star the recommendations and specific steps that you, as a classroom teacher, can work to implement in your professional practice with support • When directed, share your starred items with a shoulder partner

  9. “D” Preparatory Reading Reflection First Things First: Demystifying Data Analysis Mike Schmoker poses 2 essential questions for educators to answer: • How many students are succeeding in the subject I teach? • Within those subjects, what are the areas of strengths and weaknesses? • How do you or your grade level or departmental team answer these questions now? • How can the answers to these questions efficiently drive instructional decision making at the classroom, grade level, and/or departmental level?

  10. Essential Questions • How many students are succeeding in the subject I/we teach? • Within those subjects, what are the areas of strengths and weaknesses? • How can I/we establish and sustain a culture and process for strategic instructional decision making across our building, teams, and classrooms?

  11. “Connecting the dots” when you are feeling overwhelmed! How does data-based decision making allow teachers to simultaneously improve student outcomes while also demonstrating knowledge and fluency with Missouri Teacher Standards?

  12. “M” Data-Based Decision Making and Missouri Teacher Standards ✔ Standard #1: Content knowledge and perspectives aligned with appropriate instruction ✔ Standard #2: Student learning, growth and development ✔ Standard #3: Implementing the curriculum ✔ Standard #4: Teaching for critical thinking ✔ Standard #5: Creating a positive classroom learning environment ✔ Standard #6: Effective communication ✔ Standard #7: Use of student assessment data to analyze and modify instruction ✔ Standard #8: Reflection on professional practice to assess effect of choices and actions on others ✔ Standard #9: Professional collaboration (Presenter note: animate so that check marks appear upon click as talking points; this could be an activity if desired.)

  13. Overview of Data-Based Decision Making

  14. Why use Data-Based Decision Making? “M” Using a DBDM process shifts the work of school leadership teams from a reactive or crisis-driven process to a proactive, outcomes-driven process, and sets the stage for continuous improvement. ~Gilbert, 1978; McIntosh, Horner & Sugai, 2009

  15. Why use Data-Based Decision Making? School personnel have an opportunity to grow as consumers of data who can transform data reports (e.g., graphs or charts) into meaningful information that drives effective data-based decision making for organizational change and school improvement. ~Gilbert, 1978

  16. What is Data-Based Decision Making? Data-based decision making (DBDM) involves small teams meeting regularly and using an explicit, data-driven structure to: • disaggregate data, • analyze student performance, • set incremental student learning goals, • engage in dialogue around explicit and deliberate classroom instruction, and • create a plan to monitor instruction and student learning. (MO SPDG 2013)

  17. Pre-Requisites for Effective DBDM • Leadership • Collaborative Culture • Structured and protected collaborative time • Consistent process for DBDM Cycles • Efficient Data Collection & Reporting Systems • Fidelity of implementation data • Research based instructional practices & strategies • Additional Student Data (e.g., gender, race/ethnicity, school /classroom attendance, etc.) • AND…

  18. Pre-Requisites for Effective DBDM
  Academics: • Core Academic Standards • Curriculum Maps • Identify Standard Selected for Assessment • Unwrap Standard Selected for Assessment • Common Pre, Formative and Summative Assessments • Common Scoring Guides and Rubrics
  Behavior (Social Behavioral): • Schoolwide behavioral expectations • Individual classroom behavioral expectations • Minor Office Disciplinary Referral (ODR) Form • Major Office Disciplinary Referral (ODR) Form • Minor and Major ODR data

  19. Components of the DBDM Process (graphic: DBDM cycle with Monitor)

  20. Academic DBDM Flow Chart: Collect & Chart Data → Analyze Data → SMART Goals → Instructional Decision Making → Determine Results Indicators → Monitor

  21. “M” DBDM Step 1: Collect & Chart Data

  22. Components of the DBDM Process (graphic: DBDM cycle with Monitor)

  23. Why Collect & Chart Data • Data influences decisions that guide the instruction for adults and students (Hamilton et al., 2009; Horner, Sugai, & Todd, 2001; Means, Chen, DeBarger, & Padilla, 2011; Newton, Horner, Algozzine, Todd, & Algozzine, 2009). • Charting data creates visuals that delineate current status in the classroom (Horner, Sugai, & Todd, 2001). • It leads to higher student achievement (Reeves, 2009).

  24. Collect & Chart Data: Terms to Know Common Formative Assessment (CFA) • An assessment typically created collaboratively by a team of teachers responsible for the same grade level or course. Common formative assessments are used frequently throughout the year to identify (1) individual students who need additional time and support for learning, (2) the teaching strategies most effective in helping students acquire the intended knowledge and skills, (3) curriculum concerns—areas in which students generally are having difficulty achieving the intended standard—and (4) improvement goals for individual teachers and the team. Scoring Guide/Rubric • A coherent set of criteria for students’ work that includes descriptions of levels of performance quality on the criteria

  25. Collect & Chart Data: Overview • Teacher administers Common Formative Assessment (CFA). • Teacher uses Scoring Guide to score CFA. • Teacher charts classroom CFA data & gives to team leader. • Team leader compiles group CFA data into chart(s) (grade level or team). • Team leader shares charted group data at DBDM meeting.
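
For teams that chart electronically, the tally behind a teacher's classroom chart can be scripted. A minimal sketch in Python, assuming a hypothetical 0–100 CFA score scale; the student names, scores, and the 80-point proficiency cut are illustrative placeholders, not part of the training materials:

```python
# Illustrative sketch only: a teacher's classroom CFA chart as rows of
# (student, score, proficient?), plus a class summary for the team leader.
# Names, scores, and the proficiency cut are hypothetical placeholders.
PROFICIENCY_CUT = 80

classroom_scores = {"Student A": 88, "Student B": 74, "Student C": 91}

chart = [(name, score, score >= PROFICIENCY_CUT)
         for name, score in classroom_scores.items()]

for name, score, proficient in chart:
    print(f"{name:10} {score:3}  {'proficient' if proficient else 'not yet proficient'}")

pct = 100 * sum(proficient for *_, proficient in chart) / len(chart)
print(f"Class proficiency on this CFA: {pct:.0f}%")
```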

  26. DBDM Process • Teacher administers CFA. • Teacher scores CFA. • Teacher charts data & turns in. • Team leader develops chart. • Team leader shares charted data. (graphic: DBDM cycle with Monitor)

  27. Collect & Chart Data: Teacher Chart “I” or “K1” and “Q”

  28. Collect & Chart Data: Team Chart “I” or “K2” and “R”

  29. Collect & Chart Data Process

  30. Case Study: Pre-Assessment Individual Teacher Charting

  31. Case Study: Pre-Assessment Individual Teacher Charting • All teachers complete the DBDM chart given to them (either electronic or hard copy) for each student who participates in the CFA administration. • The teachers then submit the charted data to the individual whose role it is to collate the grade level or departmental data.
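
Where the collation is done electronically, the grade-level or departmental chart is simply the classroom charts stacked and summarized. A minimal sketch, assuming each teacher submits (student, score) pairs; the teacher names, student names, scores, and proficiency cut are hypothetical placeholders:

```python
# Illustrative sketch only: collate hypothetical classroom CFA charts from
# several teachers into one grade-level summary for the DBDM meeting.
PROFICIENCY_CUT = 80

submissions = {
    "Teacher 1": [("Student A", 88), ("Student B", 74)],
    "Teacher 2": [("Student C", 91), ("Student D", 63)],
}

grade_level = [(teacher, student, score)
               for teacher, rows in submissions.items()
               for student, score in rows]

n_proficient = sum(score >= PROFICIENCY_CUT for _, _, score in grade_level)
print(f"{n_proficient} of {len(grade_level)} students proficient "
      f"({100 * n_proficient / len(grade_level):.0f}%)")
```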

  32. Case Study: Pre-Assessment Team Charting

  33. Collect & Chart Data: Practice Profile “H”

  34. Implementation Fidelity

  35. “H” Collect & Chart Data: Next Steps Using the results from the DBDM Practice Profile dialogue to: • Assess your team's/building's current knowledge and implementation fluency with Collect & Chart Data • Determine possible next steps: • Decide what format your team/building will utilize (electronic or hard copy). • Plan for hands-on training so that all teachers know how to chart their student data. • Establish who will collate the team data, & consider whether they will need training as well. • Establish dates for submitting and for sharing collated data. • Identify specific ways your team will want/need data to be disaggregated.

  36. Next Steps: Action=Results What steps will you take to start implementing?

  37. “M” DBDM Step 2: Analyze & Prioritize

  38. DBDM Process (graphic: DBDM cycle with Monitor)

  39. Why Analyze & Prioritize The failure to achieve meaningful outcomes during school improvement activities is often due to a poor match between problems and the intensity, fidelity, or focus of interventions that are required. ~Sprague et al., 2001

  40. Analyze & Prioritize: Terms to Know • Decision Rules: clear, specific guidelines for making data-driven decisions (e.g., at least 80% of students should be meeting academic benchmarks) • Inference: generating possible explanations to derive accurate meaning from performance data
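
A decision rule like the 80% example above can be written as a simple check against the charted data. A minimal sketch; the grade-level counts used in the example are hypothetical:

```python
# Illustrative sketch only: apply the example decision rule that at least
# 80% of students should be meeting academic benchmarks.
DECISION_RULE = 0.80  # threshold taken from the example decision rule above

def meets_decision_rule(n_meeting_benchmark: int, n_assessed: int) -> bool:
    """True when the share of students meeting benchmarks satisfies the rule."""
    return n_meeting_benchmark / n_assessed >= DECISION_RULE

# Hypothetical result: 19 of 26 students met benchmark (about 73%),
# so the rule is not met and the team digs deeper into the data.
print(meets_decision_rule(19, 26))  # False
```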

  41. Analyze & Prioritize: Overview • Team uses student work to observe and identify strengths and obstacles (errors and misconceptions) as well as trends and patterns • Team develops inferences based on data • What is present becomes strengths • What is missing becomes obstacles or challenges • Team prioritizes by focusing on the most urgent needs of learners

  42. Analyze & Prioritize: Observations Examine student work that is proficient and higher and list • Strengths • Consistent skills • Trends Examine student work that is not proficient and list • Strengths and obstacles • Students consistently rated not proficient • Error Analysis • Inconsistent skills • Misconceptions in thinking • Trends • Trends related to certain subgroups (e.g., ELL, gender, race/ethnicity, school attendance, attendance in classrooms, engagement, etc.)

  43. Analyze & Prioritize: Inferences • For each subgroup of students (Proficient and Higher, Close to Proficient, Far to Go, and Intervention), infer what each listed performance strength means (i.e., cause for celebration) • For students in the Close to Proficient, Far to Go, and Intervention subgroups, infer what each listed performance strength or obstacle means

  44. Analyze & Prioritize:Prioritization • For students in Proficient and Higher subgroups prioritize what might be a logical Next Step for further instruction to enhance student knowledge and use of the prioritized standard. • For students in the Close to Proficient, Far to Go, and Intervention subgroups prioritize which of the performance strengths or obstacles should be the logical Next Step for student instruction and support to develop and solidify student knowledge and use of the prioritized standard.
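
The four sub-groups named above can be formed directly from the charted CFA scores so the team can see exactly who it is prioritizing. A minimal sketch; the cut scores and student data are hypothetical placeholders:

```python
# Illustrative sketch only: sort students into the four DBDM sub-groups.
# Cut scores, names, and scores are hypothetical placeholders.
SUBGROUP_CUTS = [(80, "Proficient and Higher"),
                 (70, "Close to Proficient"),
                 (60, "Far to Go"),
                 (0,  "Intervention")]

def subgroup(score):
    """Return the first sub-group whose cut score the CFA score meets."""
    return next(label for cut, label in SUBGROUP_CUTS if score >= cut)

scores = {"Student A": 88, "Student B": 74, "Student C": 66, "Student D": 41}

groups = {}
for name, score in scores.items():
    groups.setdefault(subgroup(score), []).append(name)

for _, label in SUBGROUP_CUTS:
    print(f"{label}: {', '.join(groups.get(label, [])) or 'none'}")
```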

  45. Analyze & Prioritize: Behavioral Data • For each sub-group identify if each of the following apply: • Student attendance above 95% • Low percentage of classroom managed problem behaviors • Low percentage of student removal from academic instruction • If the answer is “YES” to all three conditions an inference can be made that: • Students are present at school • Students remain in the classroom for academic instruction

  46. Analyze & Prioritize: Behavioral Data • For each sub-group identify if each of the following apply: • Student attendance above 95% • Low percentage of classroom managed problem behaviors • Low percentage of student removal from academic instruction • If the answer is “NO” to any of the conditions the team needs to consider: • Which condition is not met? • Are universal effective classroom management practices in place with consistency and intensity needed to meet the foundational behavioral support needs of the students under scrutiny?
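
The three behavioral conditions above also lend themselves to a quick per-sub-group check. A minimal sketch; the 95% attendance threshold comes from the slide, while the cut-off used for a "low percentage" is a hypothetical placeholder:

```python
# Illustrative sketch only: check the three behavioral conditions for one
# sub-group. The "low percentage" threshold is a hypothetical placeholder.
LOW_PCT = 0.05

def behavioral_conditions_met(attendance, classroom_managed, removals):
    """Print YES/NO for each condition and return True only if all are met."""
    checks = {
        "Student attendance above 95%": attendance > 0.95,
        "Low percentage of classroom managed problem behaviors": classroom_managed < LOW_PCT,
        "Low percentage of student removal from academic instruction": removals < LOW_PCT,
    }
    for condition, met in checks.items():
        print(f"{condition}: {'YES' if met else 'NO'}")
    return all(checks.values())

# Hypothetical sub-group data: one condition fails, so the team reviews
# whether universal classroom management practices are in place.
behavioral_conditions_met(attendance=0.97, classroom_managed=0.12, removals=0.02)
```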

  47. Analyze & Prioritize Case Study: Proficient & Higher Sub-group “J” or “L”

  48. Analyze & Prioritize Case Study: Close to Proficient Sub-group “J” or “L”
