Data-Based Decision Making
Presentation Transcript
Openings & Introductions
  • Session-at-a-glance
  • Introductions
  • Training Norms
  • Learner Objectives
  • Pre-Session Readings and Essential Questions
Session-At-A-Glance
  • Overview of the data-based decision making (DBDM) (30 minutes)
  • 6-steps of the DBDM process (4 hours)
    • Why?
    • What?
    • Implementation fidelity indicators for each step
    • Team action planning for each step
  • Action Planning for entire DBDM process (30 minutes)
Introductions
  • Consultant: add information needed regarding introductions for your session
  • Select and use an inclusion activity at consultant discretion
Training Norms
  • Begin and end on time
  • Be an engaged participant
  • Be an active listener—open to new ideas
  • Use notes for side bar conversations
  • Use electronics respectfully
Learner Outcomes
  • Teacher learns how data-based decision making allows for demonstration of Missouri Teacher Standards
  • Teacher develops knowledge and applies steps of DBDM “Cycles” (Data Teams) with example data sets:
    • Develop classroom system of data collection and charting
    • Analyze and disaggregate student learning
    • Establish student goals based on results
    • Select instructional practices
    • Determine results indicators (cause) and product (effect / social emotional and behavioral)
    • Design ongoing monitoring of results (monitor, reflect, adjust, repeat)
      • Review results indicators
      • Review implementation:
        • Instructional practices
        • Data cycle
Learner “Post” Objectives

3. Teacher utilizes steps of DBDM “Cycles” with their classroom data

  • Teacher will collect, chart, analyze and disaggregate student learning data as well as implementation data
  • Teacher will explain results indicators for process (cause) and product (effect)
  • Teacher will design ongoing monitoring of results (monitor, reflect, adjust, repeat)
Preparatory Reading Reflection

“C”

Using Student Data to Support Instructional Decision Making

  • Review the 5 recommendations in the IES Practice Guide Summary
  • Mark with a star which of those recommendations and specific steps, with support, you as a classroom teacher can work to implement into your professional practice
  • When directed share your starred items with a shoulder partner
Preparatory Reading Reflection

“D”

First Things First: Demystifying Data Analysis

Mike Schmoker poses 2 essential questions for educators to answer:

    • How many students are succeeding in the subject I teach?
    • Within those subjects, what are the areas of strength and weakness?
  • How do you or your grade level or departmental team answer these questions now?
  • How can the answers to these questions efficiently drive instructional decision making at the classroom, grade level and/or departmental level?
Essential Questions
  • How many students are succeeding in the subject I/we teach?
  • Within those subjects, what are the areas of strength and weakness?
  • How can I/we establish and sustain a culture and process for strategic instructional decision making across our building, teams and classrooms?
“Connecting the dots” when you are feeling overwhelmed!

How does data-based decision making allow teachers to simultaneously improve student outcomes while also demonstrating knowledge and fluency with Missouri Teacher Standards?

Data-Based Decision Making and Missouri Teacher Standards

“M”

Standard #1: Content knowledge and perspectives aligned with appropriate instruction

Standard #2: Student Learning, growth and development

Standard #3: Implementing the curriculum

Standard #4: Teaching for critical thinking

Standard #5: Creating a positive classroom learning environment

Standard #6: Effective Communication

Standard #7: Use of student assessment data to analyze and modify instruction

Standard #8: Reflection on professional practice to assess effect of choices and actions on others

Standard #9: Professional collaboration

PLEASE animate so that check marks appear upon click as talking points… this could be an activity if desired!

Why use Data-Based Decision Making?

“M”

Using a DBDM process shifts the work of school leadership teams from a reactive or crisis driven process to a pro-active, outcomes driven process, and sets the stage for continuous improvement.

~Gilbert, 1978; McIntosh, Horner & Sugai, 2009

Why use Data-Based Decision Making?

School personnel have an opportunity to grow as consumers of data who can transform data reports (e.g., graphs or charts) into meaningful information that drives effective data-based decision making for organizational change and school improvement.

~Gilbert, 1978

What is Data-Based Decision Making?

Data-based decision making (DBDM) involves small teams meeting regularly and using an explicit, data-driven structure to:

  • disaggregate data,
  • analyze student performance,
  • set incremental student learning goals,
  • engage in dialogue around explicit and deliberate classroom instruction, and
  • create a plan to monitor instruction and student learning.

(MO SPDG 2013)

Pre-Requisites for Effective DBDM
  • Leadership
  • Collaborative Culture
  • Structured and protected collaborative time
  • Consistent process for DBDM Cycles
  • Efficient Data Collection & Reporting Systems
  • Fidelity of implementation data
  • Research based instructional practices & strategies
  • Additional Student Data (e.g., gender, race/ethnicity, school /classroom attendance, etc.)
  • AND…
Pre-Requisites for Effective DBDM

Academics:
  • Curriculum Maps
  • Identify Standard Selected for Assessment
  • Unwrap Standard Selected for Assessment
  • Common Pre, Formative and Summative Assessments
  • Common Scoring Guides and Rubrics

Behavior:
  • Core Academic Standards (Social Behavioral)
  • Schoolwide behavioral expectations
  • Individual Classroom behavioral expectations
  • Minor Office Disciplinary Referral (ODR) Form
  • Major Office Disciplinary Referral (ODR) Form
  • Minor and Major ODR data
Academic DBDM Flow Chart

Collect & Chart Data → Analyze Data → SMART Goals → Instructional Decision Making → Determine Results Indicators → Monitor

Why Collect & Chart Data
  • data influences decisions that guide the instruction for adults and students (Hamilton et al., 2009; Horner, Sugai, & Todd, 2001; Means, Chen, DeBarger, & Padilla, 2011; Newton, Horner, Algozzine, Todd, & Algozzine, 2009).
  • charting data creates visuals that delineate current status in the classroom (Horner, Sugai, & Todd, 2001).
  • it leads to higher student achievement (Reeves, 2009)
Collect & Chart Data: Terms to Know

Common Formative Assessment (CFA)

  • An assessment typically created collaboratively by a team of teachers responsible for the same grade level or course. Common formative assessments are used frequently throughout the year to identify (1) individual students who need additional time and support for learning, (2) the teaching strategies most effective in helping students acquire the intended knowledge and skills, (3) curriculum concerns—areas in which students generally are having difficulty achieving the intended standard—and (4) improvement goals for individual teachers and the team.

Scoring Guide/Rubric

  • A coherent set of criteria for students’ work that includes descriptions of levels of performance quality on the criteria
Collect & Chart Data: Overview
  • Teacher administers Common Formative Assessment (CFA).
  • Teacher uses Scoring Guide to score CFA.
  • Teacher charts classroom CFA data & gives to team leader.
  • Team leader compiles group CFA data into chart(s) (grade level or team).
  • Team leader shares charted group data at DBDM meeting.
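The five charting steps above can be sketched in code. This is an illustrative sketch only, not part of the training materials: the proficiency band cut scores and the data structures (one {student: score} dictionary per classroom) are assumptions made for demonstration.

```python
# Hypothetical sketch of the collect-and-chart workflow: each teacher
# submits a {student: CFA score} chart; the team leader compiles group
# counts per proficiency band. Band cut scores are illustrative only.

def band(score):
    """Assign a 0-100 CFA score to an illustrative proficiency band."""
    if score >= 80:
        return "Proficient and Higher"
    if score >= 70:
        return "Close to Proficient"
    if score >= 50:
        return "Far to Go"
    return "Intervention"

def compile_team_chart(class_charts):
    """Team leader step: merge each teacher's chart into group counts."""
    counts = {}
    for chart in class_charts:
        for score in chart.values():
            label = band(score)
            counts[label] = counts.get(label, 0) + 1
    return counts

room_101 = {"Ann": 85, "Ben": 72, "Cal": 45}
room_102 = {"Dee": 90, "Eli": 55}
print(compile_team_chart([room_101, room_102]))
```

The compiled counts are what the team leader would turn into the group chart shared at the DBDM meeting.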
DBDM Process
  • Teacher Administers CFA.
  • Teacher scores CFA.
  • Teacher charts data & turns in.
  • Team Leader develops chart.
  • Team Leader shares charted data.

Monitor

Collect & Chart Data: Teacher Chart

“I” or “K1” and “Q”

Collect & Chart Data: Team Chart

“I” or “K2” and “R”

Case Study: Pre-Assessment Individual Teacher Charting
  • All teachers complete the DBDM chart given to them (either electronic or hard copy) for each student who participates in the CFA administration.
  • The teachers then submit the charted data to the individual whose role it is to collate the grade level or departmental data.

“H”

Collect & Chart Data: Next Steps

Using the results from the DBDM Practice Profile dialogue to:

  • Assess your team/building current knowledge and implementation fluency with Collect & Chart Data
  • Determine possible next steps:
    • Decide what format your team/building will utilize (electronic or hard copy).
    • Plan for hands-on training so that all teachers know how to chart their student data.
    • Establish who will collate the team data, & consider if they will need training as well.
    • Establish dates for submitting and for sharing collated data.
    • Identify specific ways your team will want/need data to be disaggregated.
Next Steps: Action=Results

What steps will you take to start implementing?

Why Analyze & Prioritize

The failure to achieve meaningful outcomes during school improvement activities is often due to a poor match between problems and the intensity, fidelity, or focus of interventions that are required.

~Sprague et al., 2001

Analyze & Prioritize: Terms to Know
  • Decision Rules: clear, specific guidelines for making data-driven decisions (e.g., at least 80% of students should be meeting academic benchmarks)
  • Inference: generate possible explanations to derive accurate meaning from performance data
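The decision-rule example above (at least 80% of students meeting academic benchmarks) can be expressed as a small check. This is a sketch: the benchmark score, required share, and function name are illustrative assumptions, not from the training materials.

```python
# Illustrative decision rule: do at least 80% of students meet the
# academic benchmark? Benchmark score and required share are assumptions.

def meets_decision_rule(scores, benchmark=70, required_share=0.80):
    """Return True if the share of students at or above the benchmark
    satisfies the decision rule."""
    at_benchmark = sum(1 for s in scores if s >= benchmark)
    return at_benchmark / len(scores) >= required_share

print(meets_decision_rule([72, 85, 68, 90, 77]))  # 4 of 5 students = 80%
```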
Analyze & Prioritize: Overview
  • Team uses student work to observe and identify strengths and obstacles (errors and misconceptions) as well as trends and patterns
  • Team develops inferences based on data
    • What is present becomes strengths
    • What is missing becomes obstacles or challenges
  • Team prioritizes by focusing on the most urgent needs of learners
Analyze & Prioritize: Observations

Examine student work that is proficient and higher and list

  • Strengths
  • Consistent skills
  • Trends

Examine student work that is not proficient and list

  • Strengths and obstacles
  • Students consistently rated not proficient
  • Error Analysis
    • Inconsistent skills
    • Misconceptions in thinking
  • Trends
  • Trends related to certain subgroups (e.g., ELL, gender, race/ethnicity, school attendance, attendance in classrooms, engagement, etc.)
Analyze & Prioritize: Inferences
  • For each subgroup of students (Proficient and Higher, Close to Proficient, Far to Go, and Intervention) infer what each listed performance strength means (i.e., cause for celebration)
  • For students in the Close to Proficient, Far to Go, and Intervention subgroups, infer what each listed performance strength or obstacle means
Analyze & Prioritize: Prioritization
  • For students in Proficient and Higher subgroups prioritize what might be a logical Next Step for further instruction to enhance student knowledge and use of the prioritized standard.
  • For students in the Close to Proficient, Far to Go, and Intervention subgroups prioritize which of the performance strengths or obstacles should be the logical Next Step for student instruction and support to develop and solidify student knowledge and use of the prioritized standard.
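One hedged way to support the prioritization step in code: tally the obstacles a team lists while examining a subgroup's student work and surface the most frequent one as a candidate Next Step focus. The obstacle labels below are hypothetical.

```python
# Illustrative prioritization helper: count how often each obstacle was
# noted across a subgroup's student work and return the most common one.
from collections import Counter

def prioritize_obstacles(observations):
    """observations: obstacle labels noted while examining student work."""
    return Counter(observations).most_common(1)[0][0]

notes = ["regrouping errors", "misreads word problems", "regrouping errors"]
print(prioritize_obstacles(notes))
```

A frequency count is only a starting point; the team's professional judgment about urgency still drives the final priority.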
Analyze & Prioritize: Behavioral Data
  • For each sub-group identify if each of the following apply:
    • Student attendance above 95%
    • Low percentage of classroom managed problem behaviors
    • Low percentage of student removal from academic instruction
  • If the answer is “YES” to all three conditions an inference can be made that:
    • Students are present at school
    • Students remain in the classroom for academic instruction
Analyze & Prioritize: Behavioral Data
  • For each sub-group identify if each of the following apply:
    • Student attendance above 95%
    • Low percentage of classroom managed problem behaviors
    • Low percentage of student removal from academic instruction
  • If the answer is “NO” to any of the conditions the team needs to consider:
    • Which condition is not met?
    • Are universal effective classroom management practices in place with consistency and intensity needed to meet the foundational behavioral support needs of the students under scrutiny?
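The three behavioral conditions above translate directly into a screen: all must hold before making the inference, and any failure points the team at the unmet condition. This sketch assumes numeric thresholds for "low percentage", which the training materials leave undefined.

```python
# Hypothetical screen for the three behavioral conditions. The ODR and
# removal thresholds stand in for "low percentage", which the slides
# do not define numerically.

def behavioral_screen(attendance_rate, classroom_odr_rate, removal_rate,
                      odr_threshold=0.05, removal_threshold=0.05):
    """Return (all_conditions_met, unmet_conditions) for one subgroup."""
    unmet = []
    if attendance_rate <= 0.95:
        unmet.append("student attendance above 95%")
    if classroom_odr_rate >= odr_threshold:
        unmet.append("low percentage of classroom managed problem behaviors")
    if removal_rate >= removal_threshold:
        unmet.append("low percentage of removal from academic instruction")
    return (not unmet, unmet)

print(behavioral_screen(0.97, 0.02, 0.01))  # all three conditions met
```

When the second element is non-empty, it names exactly which condition the team should examine, matching the "Which condition is not met?" question above.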
Analyze & Prioritize: Practice Profile

“H”

PLEASE Insert the finalized / approved Practice Profile Pieces for this step

Analyze & Prioritize: Next Steps

“H”

Using the results from the DBDM Practice Profile dialogue to:

  • Assess your team/building current knowledge and implementation fluency with Analyze & Prioritize
  • Determine possible next steps:
    • Identify academic strengths for each student sub-group
    • Identify academic obstacles for each sub-group
    • Identify behavioral strengths and/or obstacles for each sub-group
    • Develop possible next instructional steps (academic or behavioral) for each sub-group that directly connect to inferences made for each sub-group
Why Develop a SMART Goal

“According to research, goal setting is the single most powerful motivational tool in a teacher’s toolkit. Why? Because goal setting operates in ways that provide:

  • Purpose
  • Challenge
  • Meaning

Goals are the guideposts along the road that make a compelling vision come alive. Goals energize people.

Specific, clear, challenging goals lead to greater effort and achievement than easy or vague goals do.”

(Blanchard, 2007, p. 150)


The lack of clear goals may provide the most credible explanation for why we are still only inching along in our effort to improve schooling for U.S. children.

~Mike Schmoker

SMART Goals: Terms to Know
  • Specific: Says exactly who the learner is and what the learner will be able to do
  • Measurable: Objective definition such that the behavior can be observed and counted
  • Attainable: A skill that learners can master within the given period of time
  • Results-Oriented: Must be something learners can do to demonstrate growth; relevant to the learner
  • Time Bound: Achievable by the time frame set
Measurable Goals

“Clear, measurable goals are at the center of the mystery of a school’s success, mediocrity or failure.”

~S. J. Rosenholtz

For SMART Goals to make a difference to teachers…
  • Teachers have to be engaged in the process of developing the goal so they own the goal.
  • Teachers have to look at the data and design a goal that makes sense to them.
  • The goal becomes powerful when teachers use it to inform their practice.

SMART Goal: Overview

The percentage of (Name Student Group) scoring proficient or higher in (Name the Content Area) will increase from (Current Status Percentage) to (Goal Percentage) by the end of (Month, Quarter or Date) as measured by (Assessment Tool) administered on (Specific Date).

SMART Goal: Example
  • First grade students enrolled in the District and on IEPs scoring proficient or higher in reading comprehension will increase from 45% to 60% by the end of the third quarter grading period as measured by a teacher-made 10-question comprehension test administered two days prior to the end of the third quarter.
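Because the SMART goal frame is a fill-in-the-blank sentence, it can be captured as a template string. This sketch uses the example values from this slide; the template text itself is paraphrased from the overview frame and the variable names are assumptions.

```python
# Illustrative SMART goal template mirroring the fill-in-the-blank frame.

SMART_TEMPLATE = (
    "The percentage of {group} scoring proficient or higher in {content} "
    "will increase from {current}% to {target}% by the end of {deadline} "
    "as measured by {tool} administered on {date}."
)

goal = SMART_TEMPLATE.format(
    group="first grade students enrolled in the District and on IEPs",
    content="reading comprehension",
    current=45,
    target=60,
    deadline="the third quarter grading period",
    tool="a teacher-made 10-question comprehension test",
    date="two days prior to the end of the third quarter",
)
print(goal)
```

Every blank in the frame becomes a named field, which makes it harder for a team to skip one of the SMART components when drafting a goal.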

“J” or “L”

SMART Goal Case Study

Setting a Goal: Example Pre-Instructional CFA Data (Assessment of student knowledge of priority standard BEFORE instruction)

Currently there are 97 students in the group analyzed:

  • 17 students proficient or higher (= 17.5%; students whose learning the team will want to maintain and enrich)
  • 24 students close to proficiency (= 24.7%; students whose learning may be most readily moved towards proficiency with instruction)
  • 41 students (17 + 24) most likely to meet proficiency goals at unit end (= 42% of all students)

If the team has a goal of at least 83% of students at proficiency, then instruction will need to be designed and implemented to accelerate the learning of 40 additional students (far to go but likely to make it) (= 41%).

81 / 97 = 83% proficient
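The counts on this slide can be checked with a few lines of arithmetic; the numbers below come straight from the example (97 students, 17 proficient, 24 close to proficiency, 83% goal), while the variable names are my own.

```python
# Checking the example arithmetic: how many additional students must
# instruction accelerate to reach an 83% proficiency goal?

total = 97
proficient = 17                  # proficient or higher (about 17.5%)
close = 24                       # close to proficiency (about 24.7%)
on_track = proficient + close    # 41 students projected to reach proficiency

goal_share = 0.83
needed = round(goal_share * total)   # students required at proficiency
additional = needed - on_track       # far-to-go students to accelerate
print(needed, additional)
```

This reproduces the slide's figures: 81 students at proficiency, of whom 40 must come from the far-to-go group.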

SMART Goal: Implementation Example

The percentage of 4th grade students scoring proficient or higher in Comm. Arts will increase from 17% to 88% by the end of six weeks as measured by Common Formative Assessments administered on (Date).

“H”

SMART Goal: Practice Profile

PLEASE Insert the finalized / approved Practice Profile Pieces for this step


“H”

SMART Goal: Next Steps

Using the results from the DBDM Practice Profile dialogue to:

  • Assess your team/building current knowledge and implementation fluency with SMART Goals
  • Determine possible next steps:
    • Identify how many students are in each group and by IEP status (Step 1)
    • Calculate percentage for how many students are in each group by IEP status (Step 2)
    • Develop an ambitious, yet achievable goal for the percentage of students who can with strategic instruction meet the criterion score on the Common Formative Assessment written for the Priority Learning Target under analysis (Step 3)
Effective Teaching/LearningPractices & Strategies

Instructional Practices:

  • Effective teaching/learning practices at the classroom level are research-based effective methods that are not content related and, when practiced regularly and with fidelity, improve teaching and learning in all content areas through direct application or through transfer of knowledge and skill.

Instructional Strategies:

  • Effective teaching/learning strategies at the classroom level are actions that are content related and used to help improve a particular step or steps within a content standard; they are discrete.
Revised Instructional Assessment Model with Data Analysis
  • Schoolwide Selection, Professional Development and Implementation of an Effective Teaching and Learning Practice (ETLP)
Effective Teaching/Learning Practices

The MO SPDG Collaborative Work has selected 4 ETLPs from the meta-analysis work of John Hattie and developed training packets for school use:

  • Assessment Capable Learners
  • Feedback
  • Spaced vs. Massed
  • Reciprocal Teaching

Each school will select 1 ETLP for schoolwide professional learning and implementation


“H”

Instructional Decision Making: Practice Profile

PLEASE Insert the finalized / approved Practice Profile Pieces for this step


“H”

Effective Teaching/ Learning Practices: Next Steps

Using the results from the DBDM Practice Profile dialogue to:

  • Assess your team/building current knowledge and implementation fluency with Effective Teaching and Learning Practices
  • Determine possible next steps:
    • Identify an ETLP for schoolwide implementation
    • Identify instructional strategies that are proven effective for the academic domain of the Priority Learning Target under consideration
    • Match the prioritized instructional next steps for each sub-group with the appropriate instructional practices or strategies
Why Results Indicators
  • They allow us to monitor progress of:
    • implementation of our strategies/practices
    • effectiveness of our strategies/practices
  • They facilitate the planning for sustaining or revising of our strategies/practices
Results Indicators: Terms to Know
  • Cause Data
    • Data that measures adult behaviors
  • Effect Data
    • Data that measures student outcomes
  • “Look fors”
    • Indicators in student work which demonstrate change in proficiency
Results Indicators: Overview
  • Results Indicators include
    • Adult behavior (Cause)
    • Student behavior (Effect)
    • “Look fors” in student work (Effect)
  • Articulated for each instructional group
  • Directly linked to prioritized needs and strategies/practices selected
  • Specific and clear enough to allow for
    • prediction of student outcomes prior to next assessment
    • replication of practice
Cause – Effect Activity

“O”

Cause:
  • % of teachers teaching the social skills lessons on a weekly basis
  • Using the supplemental questions to practice the format of the test
  • # of teachers adhering to the allocated instructional minutes for literacy
  • # of teachers using bell to bell activities to review the science objectives

Effect:
  • % of students passing the formative quiz given on Friday
  • Results on the fluency screening assessment in January
  • % of students who have 3-5 office referrals for the year
  • Scores on the math chapter test
Results Indicators: Process
  • Identify a prioritized need for each instructional group and select an evidence based practice or strategy
  • Develop descriptors of what should be observable if adults implement the practice or strategy with fidelity
  • Develop descriptors of what should be observable in student behavior if the adults implement the practice or strategy with fidelity
  • Develop descriptors of what should be observable in the student work if the practice or strategy is implemented effectively
  • Establish a cause/effect relationship between the practice or strategy and the results
Results Indicators: Implementation Example

Results indicators complete the sentence:

  • “If teachers do _____________, then students will _________________.”
Results Indicators: Implementation Example
  • If a teacher models using a comparison matrix in a mini lesson, then the students will extend the comparison matrix as they read further in the text, which demonstrates evaluation-level thinking.
  • “Look fors”: students will be able to describe how different words affect them as readers in the comparison matrix.
Results Indicators: Implementation Example
  • If a teacher models syllable patterns, blend patterns and word chunks daily during guided reading, then students will apply strategies to their leveled guided reading text.
  • “Look fors”: Students will read more fluently and with greater comprehension.
Results Indicators: Implementation Example
  • If a teacher models using whole group behavior expectations during a mini lesson for classroom behaviors, then the students will follow whole group behavior expectations during whole group instruction.
  • “Look fors”: students will be able to
    • Stay in personal space
    • Keep all hands, feet and objects to self
    • Respond appropriately to thoughts of others
    • Raise a hand to indicate they have something to say

“H”

Results Indicators: Practice Profile

PLEASE Insert the finalized / approved Practice Profile Pieces for this step


“H”

Results Indicators: Next Steps

Using the results from the DBDM Practice Profile dialogue to:

  • Assess your team/building current knowledge and implementation fluency with Results Indicators
  • Determine possible next steps:
    • Identify teacher behaviors for implementation fidelity of practice/strategy selected
    • Identify student behaviors that demonstrate knowledge or application of Priority Learning Target
    • Identify “look fors” in student work that will demonstrate knowledge and ability to apply the Priority Learning Target

Monitoring is an ongoing process used by educators throughout the entire data-based decision making cycle, in which student performance (effect data) and adult behaviors (cause data) are observed, measured, and recorded to make decisions about progress, successes, and challenges, and to provide feedback regarding next steps.

Why Monitor
  • To engage in a continuous improvement cycle.
    • Monitoring allows educators to reflect on their professional practice.
    • Monitoring allows for mid-course corrections.
    • Monitoring allows for short term wins.
    • Through lessons learned, monitoring leads to next steps.
    • Monitoring ensures fidelity of implementation.
    • Monitoring must consider both cause and effect data.
Monitor: Process

Cause:
  • Teachers administer CFA with fidelity.
  • Teachers collect and chart data appropriately.
  • Teachers analyze and prioritize results of CFA.
  • Teachers develop a S.M.A.R.T. Goal.
  • Teachers determine the Effective Teaching and Learning Practice.
  • Teachers support each other in the use of the practice.
  • Teachers describe the implementation of the practice (frequency, effectiveness, feedback, celebrations/challenges).

Effect:
  • Teachers examine student work samples to provide evidence of implementation of the practice and to determine its impact.
  • Teachers discuss the effectiveness of the practice (continue, modify, or stop).
Monitoring Components
  • Monitoring Process & Practices
    • Sources of Data to Monitor
    • Individual(s) Responsible
    • Timeline
  • Evaluate the DBDM Process
    • We Planned These
    • We Achieved These
  • Apply what was Learned
    • We Learned
    • We will Replicate

“H”

Monitor: Practice Profile

PLEASE Insert the finalized / approved Practice Profile Pieces for this step


“H”

Monitoring: Next Steps

Using the results from the DBDM Practice Profile dialogue to:

  • Assess your team/building current knowledge and implementation fluency with Monitoring
  • Determine possible next steps:
    • Establish standard reflection questions for each team to use as they monitor
      • Implementation of the DBDM process
      • Teacher and student outcomes as a result of DBDM process
    • Establish timelines for team sharing to schoolwide leadership team

Coming Full Circle

Revisiting DBDM Essential Questions

Developing Action Steps

Embedding DBDM into Professional Practice

Essential Questions
  • How many students are succeeding in the subject I/we teach?
  • Within those subjects, what are the areas of strength and weakness?
  • How can I/we establish and sustain a culture and process for strategic instructional decision making across our building, teams and classrooms?

?

Practice Profile to Action Steps(Implementation Drivers)

Review your scoring on the Practice Profile for the DBDM 6-steps:

  • Where are there strategic opportunities for your team/faculty to implement action steps that can move your process towards DBDM forward efficiently and effectively?
  • What are your teams/faculty goals for DBDM during the current school year?
  • What job embedded professional learning will need to take place once your team/faculty returns to your building?

?

Practice Profile to Action Steps(Implementation Barriers)

Review your scoring on the Practice Profile for the DBDM 6-steps:

  • What action steps will your team/faculty need to implement within Step 6 – Monitoring to increase the likelihood of fidelity of implementation?
  • How will your team/faculty respond to resistant team/faculty members?

Putting DBDM into Professional Practice

How to make DBDM the way your building/team does instructional decision making on a consistent basis.

Sample DBDM Team Schedule

* Possible time frame to complete one full cycle of PLC/Data Teams process:

1 month for DBDM Teams who meet once per week

2 months for DBDM Teams who meet twice a month

DBDM “Placemat”

Student Achievement Goal:

Celebration when goal is achieved:


Strategic process for improved instructional data-based decision making leading to increased student achievement


Pre-Instruction Assessment and Mid-Assessment

Intention: Students growing in proficiency with Prioritized Learning Targets through strategic instructional decision making.


Green indicates movement up, red indicates movement down, and black indicates some students moved up and some moved down into the group.


Pre-Instruction Assessment to Post-Assessment

Intention: Students growing in proficiency with Prioritized Learning Targets through strategic instructional decision making


“H”

Practice Profile

PLEASE Insert the finalized / approved Practice Profile Pieces for this step

References
  • Blanchard, K. (2007). Leading at a higher level: Blanchard on leading and higher performing organizations. Upper Saddle River, NJ: Prentice Hall.
  • Gilbert, T.F., (1978). Human competence: Engineering worthy performance. New York, NY: McGraw-Hill.
  • Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/
  • Horner, R. H., Sugai, G., & Todd, A.W. (2001). “Data” need not be a four-letter word: Using data to improve schoolwide discipline. Beyond Behavior, 11(1), 20-22.
  • Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Retrieved from http://www2.ed.gov/about/offices/list/opepd/ppss/reports.html
  • McIntosh, K., Horner, R. H., & Sugai, G. (2009). Sustainability of systems-level evidence-based practices in schools: Current knowledge and future directions. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 327-352). New York, NY: Springer.
References
  • Newton, S. J., Horner, R. H., Algozzine, R. F., Todd, A. W., & Algozzine, K. M. (2009). Using a problem-solving model to enhance data-based decision making in schools. In W. Sailor, G. Dunlap, G. Sugai & R. Horner (Eds.) Handbook of positive behavior support (pp. 551-580). New York, NY: Springer Science & Business Media, LLC.
  • Reeves, D. B. (2009). Leading change in your school: How to conquer myths, build commitment, and get results. Alexandria, VA: Association for Supervision and Curriculum Development.
  • Rosenholtz, S. J. (1991). Teachers’ workplace: The social organization of schools. New York, NY: Teachers College Press.
  • Sprague, J., Walker, H., Golly, A., White, K., Myers, D. R., & Shannon, T. (2001). Translating research into effective practice: The effects of a universal staff and student intervention on indicators of discipline and school safety. Education & Treatment of Children, 24(4), 495-511.