
Gathering Feedback for Teaching: Combining High-Quality Observations with Student Surveys and Achievement Gains





The First of Three Questions

How many distinct levels of teacher performance do you think an evaluation system should recognize?


The Second of Three Questions

What performance levels would you assign to what fraction of your district’s teachers on the following competencies?

  • Classroom Management (time, behavior, materials)

  • Goals & Tasks (clear, appropriate, rigorous, interesting)

  • Supporting Student Understanding (content depth, feedback, questioning and discussion, instructional dialogue)


The Third of Three Questions

How closely associated are teaching behaviors with student outcomes (academic growth over time)?

  • highly
  • moderately
  • weakly

[Chart: Student Performance plotted against Teacher's Classroom Observation Score]



Observation Score Distributions: Framework for Teaching



Observation Score Distributions: UTeach Observation Protocol


Measures Have Different Strengths and Weaknesses

[Table: each measure rated High, Medium, or Low across its strengths and weaknesses]


Compared to MA Degrees and Years of Experience, the Combined Measure Identifies Larger Differences

… on state tests

Compared to What?


…and on low-stakes assessments

Compared to What?


…as well as on student-reported outcomes

Compared to What?


The Measures of Effective Teaching (MET) Project

Participating Teachers

  • Two school years: 2009–10 and 2010–11

  • >100,000 students

  • Grades 4–8: ELA and Math

  • High School: ELA I, Algebra I and Biology


Research Partners

  • Our primary collaborators include:

  • Mark Atkinson, Teachscape

  • Nancy Caldwell, Westat

  • Ron Ferguson, Harvard University

  • Drew Gitomer, Educational Testing Service

  • Eric Hirsch, New Teacher Center

  • Dan McCaffrey, RAND

  • Roy Pea, Stanford University

  • Geoffrey Phelps, Educational Testing Service

  • Rob Ramsdell, Cambridge Education

  • Doug Staiger, Dartmouth College

  • Other key contributors include:

  • Joan Auchter, National Board for Professional Teaching Standards

  • Charlotte Danielson, The Danielson Group

  • Pam Grossman, Stanford University

  • Bridget Hamre, University of Virginia

  • Heather Hill, Harvard University

  • Sabrina Laine, American Institutes for Research

  • Catherine McClellan, Clowder Consulting

  • Denis Newman, Empirical Education

  • Raymond Pecheone, Stanford University

  • Robert Pianta, University of Virginia

  • Morgan Polikoff, University of Southern California

  • Steve Raudenbush, University of Chicago

  • John Winn, National Math and Science Initiative


What the Participants Said…

  • The MET Project is ultimately a research project. Nonetheless, participants frequently tell us they have grown professionally as a result of their involvement. Below is a sampling of comments we received.

  • From Teachers:

  • “The video-taping is what really drew me in, I wanted to see not only what I’m doing but what are my students doing. I thought I had a pretty good grasp of what I was doing as a teacher, but it is eye opening … I honestly felt like this is one of the best things that I have ever done to help me grow professionally. And my kids really benefited from it, so it was very exciting.”

  • "With the videos, you get to see yourself in a different way. Actually you never really get to see yourself until you see a video of yourself. I changed immediately certain things that I did that I didn't like.”

  • “I realized I learned more about who I actually was as a teacher by looking at the video. I learned of the things that I do that I think that I’m great at I was not so great at after all.”

  • “Even the things I did well, I thought, ok that's pretty good, why do I do that, and where could I put that to make it go farther. So it was a two-way road, seeing what you do well, and seeing the things that have become habits that you don't even think about anymore."

  • From Raters:

  • “Being a rater has been a positive experience for me.  I find myself ‘watching’ my own teaching more and am more aware of the things I should be doing more of in my classroom.”

  • “I have to say, that as a teacher, even the training has helped me refine my work in the classroom.  How wonderful!”

  • “I have loved observing teachers, reflecting on my own teaching and that of the teachers teaching in my school.”


MET Extension: A Library of Teaching Practice

  • Additional Data Collection

  • Subset of 360 MET Teachers (disproportionately highly effective)

  • 50 lessons taped (18,000 total lessons)

  • 100% teacher & parental consent (allowing for broader public use)

  • Cheaper cameras with potential for scale

  • Library of Practice

  • Searchable database (tagged by Common Core standards, teaching practices, etc.); a toy search sketch follows this list

  • Tagging to be done in partnership with schools of education (w/ teachers-in-training)

  • Potential Uses:

  • Rater training, certification, and calibration

  • School districts – professional development

  • Teacher training institutions – teaching methods

  • Observation instrument developers – validation of new & existing tools
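
The searchable, tagged library described above is easy to picture as a small data model. The sketch below, in Python, shows one way to filter lesson videos by tag; the LessonVideo class, the tag strings, and the search helper are hypothetical illustrations, not the MET library's actual schema.

```python
# Toy sketch of a tagged, searchable lesson-video library (hypothetical schema).
from dataclasses import dataclass, field

@dataclass
class LessonVideo:
    teacher_id: str
    grade: int
    subject: str
    tags: set = field(default_factory=set)  # e.g., Common Core codes, teaching practices

library = [
    LessonVideo("t001", 5, "Math", {"CCSS.MATH.5.NF.1", "questioning"}),
    LessonVideo("t002", 7, "ELA", {"CCSS.ELA.RL.7.2", "discussion"}),
    LessonVideo("t003", 5, "Math", {"CCSS.MATH.5.NF.1", "feedback"}),
]

def search(videos, required_tags):
    """Return the videos that carry every requested tag."""
    return [v for v in videos if required_tags <= v.tags]

# Example: find lessons tagged with a (hypothetical) standard code.
for video in search(library, {"CCSS.MATH.5.NF.1"}):
    print(video.teacher_id, video.subject, sorted(video.tags))
```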


MET Logical Sequence

[Diagram: the MET logical sequence running from Research to Use, linking the elements: measures combine, measures are reliable, measures predict, measures are stable under pressure, measures fairly reflect the teacher, measures are communicated effectively (Effective Teaching Index, Teaching Effectiveness Dashboard), and measures improve effectiveness.]


Validation Engine

System picks observation rubric & trains raters

Raters score MET videos of instruction

  • Software provides analysis of:

  • Rater consistency

  • Rubric’s relation to student learning
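
As a concrete illustration of the two analyses listed above, here is a minimal Python sketch run on made-up data: agreement between two raters who scored the same videos, and the correlation between the averaged rubric score and a value-added estimate. The variables and the simulated data are assumptions for illustration, not the validation engine's actual software or results.

```python
# Minimal sketch of rater-consistency and rubric-vs-learning checks (simulated data).
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 50

# Hypothetical "true" practice plus two raters' noisy scores on a 1-4 rubric scale.
true_practice = rng.normal(2.5, 0.5, n_teachers)
rater_a = true_practice + rng.normal(0, 0.3, n_teachers)
rater_b = true_practice + rng.normal(0, 0.3, n_teachers)

# Hypothetical value-added estimates for the same teachers.
value_added = 0.4 * (true_practice - 2.5) + rng.normal(0, 0.2, n_teachers)

# 1) Rater consistency: correlation between the two raters' scores.
rater_consistency = np.corrcoef(rater_a, rater_b)[0, 1]

# 2) Rubric's relation to student learning: correlation of the averaged score with value-added.
mean_score = (rater_a + rater_b) / 2
relation_to_learning = np.corrcoef(mean_score, value_added)[0, 1]

print(f"Rater consistency (r): {rater_consistency:.2f}")
print(f"Rubric score vs. value-added (r): {relation_to_learning:.2f}")
```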


Four Steps to High-Quality Classroom Observations


Step 1: Define Expectations (Framework for Teaching, Danielson)

Four Steps


Step 2: Ensure Accuracy of Observers

Four Steps


Step 3: Monitor Reliability

Four Steps


Multiple Observations Lead to Higher Reliability

Four Steps

NOTES: The numbers inside each circle are estimates of the percentage of total variance in FFT observation scores attributable to consistent aspects of teachers’ practice when one to four lessons were observed, each by a different observer. The total area of each circle represents the total variance in scores. These estimates are based on trained observers with no prior exposure to the teachers’ students, watching digital videos. Reliabilities will differ in practice. See the research paper, Table 11, for reliabilities of other instruments.
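
The note above treats reliability as the share of score variance attributable to consistent features of a teacher's practice. The short sketch below works through that arithmetic with placeholder variance components (not the MET estimates) to show why averaging more observed lessons pushes reliability up:

```python
# Spearman-Brown-style illustration: reliability of an average of n observations.
# The variance components are placeholders, not the figures from the MET report.

def reliability(n_obs, teacher_var, noise_var):
    """Share of variance in the n-observation average that reflects stable teacher practice."""
    return teacher_var / (teacher_var + noise_var / n_obs)

teacher_var = 0.10  # hypothetical between-teacher variance in scores
noise_var = 0.25    # hypothetical lesson-to-lesson and rater noise variance

for n in range(1, 5):
    print(f"{n} observation(s): reliability = {reliability(n, teacher_var, noise_var):.2f}")
```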


Students with Most Effective Teachers Learn More in School


Student Perceptions

Survey constructs (the seven Cs, plus Test Prep): Care, Control, Clarify, Challenge, Captivate, Confer, Consolidate, Test Prep

Care

  • My teacher makes me feel that s/he really cares about me
  • My teacher seems to know if something is bothering me
  • My teacher really tries to understand how students feel about things

Consolidate

  • My teacher takes the time to summarize what we learn each day
  • The comments that I get on my work in this class help me understand how to improve

Clarify

  • If you don't understand something, my teacher explains it a different way.
  • My teacher knows when the class understands, and when we do not.
  • My teacher has several good ways to explain each topic that we cover in the class.

Challenge

  • My teacher asks students to explain more about the answers they give.
  • My teacher doesn't let people give up when the work gets hard.
  • In this class, we learn to correct our mistakes.

Control

  • Students in this class treat the teacher with respect
  • My classmates behave the way the teacher wants them to
  • Our class stays busy and doesn't waste time

Captivate

  • My teacher makes learning enjoyable
  • My teacher makes learning interesting
  • I like the way we learn in this class

Confer

  • My teacher wants us to share our thoughts
  • Students get to decide how activities are done in this class

Test Prep

  • I have learned a lot this year about [the state test]
  • Getting ready for [the state test] takes a lot of time in our class
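
One way to picture how responses to items like these become teacher-level results is to average each construct within a class. The sketch below uses a handful of hypothetical students, an assumed 1-5 agreement scale, and shortened item names; it is an illustration, not the Tripod scoring procedure.

```python
# Averaging item responses into per-construct scores for one class (hypothetical data).
from statistics import mean

construct_items = {
    "Care": ["cares_about_me", "knows_if_bothered", "understands_feelings"],
    "Control": ["treat_with_respect", "classmates_behave", "class_stays_busy"],
    "Captivate": ["learning_enjoyable", "learning_interesting", "like_how_we_learn"],
}

# Each student's responses on an assumed 1-5 agreement scale.
students = [
    {"cares_about_me": 4, "knows_if_bothered": 3, "understands_feelings": 4,
     "treat_with_respect": 5, "classmates_behave": 4, "class_stays_busy": 4,
     "learning_enjoyable": 3, "learning_interesting": 4, "like_how_we_learn": 4},
    {"cares_about_me": 5, "knows_if_bothered": 4, "understands_feelings": 5,
     "treat_with_respect": 4, "classmates_behave": 3, "class_stays_busy": 4,
     "learning_enjoyable": 4, "learning_interesting": 4, "like_how_we_learn": 5},
]

# Average within each student, then across students, one construct at a time.
for construct, items in construct_items.items():
    per_student = [mean(s[item] for item in items) for s in students]
    print(f"{construct}: {mean(per_student):.2f}")
```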


Student Perceptions: Top 5 Correlations

Rank  Category    Survey Statement
1     Control     Students in this class treat the teacher with respect
2     Control     My classmates behave the way my teacher wants them to
3     Control     Our class stays busy and doesn't waste time
4     Challenge   In this class, we learn a lot every day
5     Challenge   In this class, we learn to correct our mistakes
…
33    Test Prep   I have learned a lot this year about [the state test]
34    Test Prep   Getting ready for [the state test] takes a lot of time in our class
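
A ranking like the one above can be produced by correlating class-average agreement on each item with an outcome such as the teacher's value-added and then sorting the items. The sketch below does this on simulated data; the items shown and their assumed relationships to the outcome are placeholders, not MET results.

```python
# Rank survey items by their correlation with an outcome (simulated data).
import numpy as np

rng = np.random.default_rng(1)
n_teachers = 200
value_added = rng.normal(0, 1, n_teachers)

# Assumed strength of each item's relationship to value-added (for simulation only).
assumed_strength = {
    "Students in this class treat the teacher with respect": 0.30,
    "Our class stays busy and doesn't waste time": 0.25,
    "Getting ready for [the state test] takes a lot of time in our class": 0.00,
}

# Simulate class-average item scores, then compute each item's correlation with the outcome.
correlations = []
for item, strength in assumed_strength.items():
    scores = strength * value_added + rng.normal(0, 1, n_teachers)
    correlations.append((np.corrcoef(scores, value_added)[0, 1], item))

for rank, (r, item) in enumerate(sorted(correlations, reverse=True), start=1):
    print(f"{rank}. r = {r:.2f}  {item}")
```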


Combining Measures Improved Reliability as well as Predictive Power

Dynamic Trio

The Reliability and Predictive Power of Measures of Teaching:

[Chart: difference in math value-added between teachers in the top and bottom quartiles of each measure (vertical axis, 0 to .25) plotted against reliability (horizontal axis, .1 to .7), with points for FFT alone, Tripod alone, VA alone, Combined (equal weights), and Combined (criterion weights).]

Note: Table 16 of the research report. Reliability based on one course section, 2 observations.

Note: For the equally weighted combination, we assigned a weight of .33 to each of the three measures. The criterion weights were chosen to maximize ability to predict a teacher’s value-added with other students. The next MET report will explore different weighting schemes.
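
To make the two weighting schemes in the note above concrete, the sketch below builds an equal-weight composite and a "criterion-weight" composite whose weights are fit by least squares against value-added measured with a different group of the teacher's students. The data are simulated and the coefficients are assumptions, not the weights used in the MET analysis.

```python
# Equal weights vs. regression-based criterion weights for a combined measure (simulated data).
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Hypothetical standardized measures for each teacher.
fft = rng.normal(0, 1, n)      # classroom observation score (FFT)
tripod = rng.normal(0, 1, n)   # student survey score (Tripod)
va = rng.normal(0, 1, n)       # value-added with one group of students

# Criterion: value-added with a *different* group of the teacher's students (assumed relation).
criterion = 0.2 * fft + 0.2 * tripod + 0.5 * va + rng.normal(0, 1, n)

X = np.column_stack([fft, tripod, va])

# Equal weights: one third on each measure.
equal_composite = X.mean(axis=1)

# Criterion weights: least-squares coefficients that best predict the criterion.
weights, *_ = np.linalg.lstsq(X, criterion, rcond=None)
criterion_composite = X @ weights

for name, composite in [("Equal weights", equal_composite), ("Criterion weights", criterion_composite)]:
    r = np.corrcoef(composite, criterion)[0, 1]
    print(f"{name}: correlation with criterion = {r:.2f}")
```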

