May 2011

Measures of Effective Teaching (MET) project




Trustworthiness Tests

  • 1. Face Validity

    • Do teachers recognize the observation instrument and other measures as reflecting qualities of practice they value?

  • 2. Coherence

    • Do the measures match your district’s theory of instruction?

  • 3. Predictive Validity

    • Do scores on your measures correlate with outcomes that you value, such as gains in student learning? (A minimal correlation check is sketched after this list.)

  • 4. Scoring Reliability

    • If a different rater had been assigned to the observation or the assessment, would the score be the same?
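
One way to make the predictive-validity test concrete is to check whether teachers who score higher on a measure also tend to post larger student gains. A minimal sketch of that check in Python, using illustrative placeholder numbers rather than MET data:

```python
# Minimal predictive-validity check: correlate each teacher's score on a
# measure with that teacher's students' value-added gain.
# All numbers are illustrative placeholders, not MET data.
import numpy as np

observation_score = np.array([2.1, 2.8, 3.4, 2.5, 3.9, 3.1])         # rubric score per teacher
value_added_gain = np.array([-0.10, 0.02, 0.15, -0.05, 0.22, 0.08])  # student gain per teacher

r = np.corrcoef(observation_score, value_added_gain)[0, 1]
print(f"Correlation between measure scores and value-added gains: r = {r:.2f}")
```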




The Measures of Effective Teaching Project

Participating Teachers

  • Two school years: 2009–10 and 2010–11

  • >100,000 students

  • Grades 4–8: ELA and Math

  • High School: ELA I, Algebra I and Biology


Research Partners

  • Our primary collaborators include:

  • Mark Atkinson, Teachscape

  • Nancy Caldwell, Westat

  • Ron Ferguson, Harvard University

  • Drew Gitomer, Educational Testing Service

  • Eric Hirsch, New Teacher Center

  • Dan McCaffrey, RAND

  • Roy Pea, Stanford University

  • Geoffrey Phelps, Educational Testing Service

  • Rob Ramsdell, Cambridge Education

  • Doug Staiger, Dartmouth College

  • Other key contributors include:

  • Joan Auchter, National Board for Professional Teaching Standards

  • Charlotte Danielson, The Danielson Group

  • Pam Grossman, Stanford University

  • Bridget Hamre, University of Virginia

  • Heather Hill, Harvard University

  • Sabrina Laine, American Institutes for Research

  • Catherine McClellan, Educational Testing Service

  • Denis Newman, Empirical Education

  • Raymond Pecheone, Stanford University

  • Robert Pianta, University of Virginia

  • Morgan Polikoff, University of Southern California

  • Steve Raudenbush, University of Chicago

  • John Winn, National Math and Science Initiative

The MET Project

The Bill & Melinda Gates Foundation launched the Measures of Effective Teaching (MET) project in fall 2009 to test new approaches to measuring effective teaching. The project's goal is to help build fair and reliable systems of teacher observation and feedback, so that teachers can improve their practice and administrators can make better personnel decisions. With funding from the foundation, data collection and analysis are led by researchers from academic institutions, nonprofit organizations, and several private firms, and are carried out in seven urban school districts.



Core Principles

  • Outcomes matter: Whenever feasible, a teacher’s evaluation should include his or her students’ achievement gains.

  • Alignment matters: Any additional components of the evaluation (e.g., classroom observations, student feedback) should be demonstrably related to student achievement gains.

  • Feedback is the most important component of evaluation: It should point to specific strengths and weaknesses intended to improve practice.



Value-Added on Each Test

  • Value-added on state assessment

  • Value-added on supplemental assessments (a toy calculation is sketched below)
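
These bullets can be read as a simple regression exercise: predict each student's current score from his or her prior score, then average the residuals by teacher, once per test. The sketch below is a toy version of that idea with made-up scores; it is not the MET project's actual value-added model.

```python
# Toy value-added sketch: regress current-year scores on prior-year scores,
# then average each teacher's residuals. Scores are made up, not MET data.
import numpy as np

prior = np.array([480, 510, 495, 530, 470, 505, 520, 490], dtype=float)
current = np.array([500, 525, 505, 555, 470, 530, 540, 500], dtype=float)
teacher = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

# Fit current = a + b * prior by ordinary least squares.
X = np.column_stack([np.ones_like(prior), prior])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
residuals = current - X @ coef

# A teacher's value-added estimate is the mean residual of his or her students.
for t in np.unique(teacher):
    print(f"Teacher {t}: value-added = {residuals[teacher == t].mean():+.2f}")
```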




Classroom Observation

Using Digital Video



Validation Engine

  • System picks observation rubric & trains raters
  • Raters score MET videos of instruction
  • Software provides analysis of:
    • Rater consistency (a minimal agreement check is sketched below)
    • Rubric's relation to student learning
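
For the rater-consistency part of that analysis, one simple summary is how often two raters who independently score the same videos agree exactly, or within one point, on the rubric scale. A minimal sketch with placeholder scores on an assumed 1–4 scale (not MET data):

```python
# Rater-consistency sketch: exact and adjacent (within one point) agreement
# between two raters who scored the same videos. Scores are placeholders.
import numpy as np

rater_1 = np.array([2, 3, 3, 4, 2, 1, 3, 2])   # first rater's scores per video
rater_2 = np.array([2, 3, 4, 4, 2, 2, 3, 3])   # second rater, same videos

diff = np.abs(rater_1 - rater_2)
print("Exact agreement rate:   ", round(float(np.mean(diff == 0)), 2))
print("Adjacent agreement rate:", round(float(np.mean(diff <= 1)), 2))
```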


Preliminary Finding #1

the kids know


Student Perceptions

The student survey statements fall into seven categories (Care, Control, Clarify, Challenge, Captivate, Confer, Consolidate) plus Test Prep:

  • Test Prep
    • I have learned a lot this year about [the state test].
    • Getting ready for [the state test] takes a lot of time in our class.
  • Clarify
    • If you don't understand something, my teacher explains it a different way.
    • My teacher knows when the class understands, and when we do not.
    • My teacher has several good ways to explain each topic that we cover in the class.
  • Care
    • My teacher makes me feel that s/he really cares about me.
    • My teacher seems to know if something is bothering me.
    • My teacher really tries to understand how students feel about things.
  • Control
    • Students in this class treat the teacher with respect.
    • My classmates behave the way the teacher wants them to.
    • Our class stays busy and doesn't waste time.
  • Captivate
    • My teacher makes learning enjoyable.
    • My teacher makes learning interesting.
    • I like the way we learn in this class.
  • Confer
    • My teacher wants us to share our thoughts.
    • Students get to decide how activities are done in this class.
  • Consolidate
    • My teacher takes the time to summarize what we learn each day.
    • The comments that I get on my work in this class help me understand how to improve.
  • Challenge
    • My teacher asks students to explain more about the answers they give.
    • My teacher doesn't let people give up when the work gets hard.
    • In this class, we learn to correct our mistakes.


Students Distinguish Between Teachers (chart: percentage of students in each classroom agreeing with each statement)
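
The chart behind that title can be approximated with a simple aggregation: for each classroom, the share of students who agree with a given statement (here assumed to be a 4 or 5 on a 1–5 scale). A toy sketch with made-up responses:

```python
# Toy sketch: percentage of students in each classroom agreeing with one
# survey statement (responses on a 1-5 scale; 4 or 5 counts as agreement).
# The data are made-up placeholders, not MET survey responses.
from collections import defaultdict

responses = [  # (classroom_id, response) pairs, hypothetical
    ("class_1", 5), ("class_1", 4), ("class_1", 2), ("class_1", 5),
    ("class_2", 3), ("class_2", 2), ("class_2", 4), ("class_2", 1),
]

agree, total = defaultdict(int), defaultdict(int)
for classroom, score in responses:
    total[classroom] += 1
    agree[classroom] += score >= 4

for classroom in sorted(total):
    print(f"{classroom}: {100 * agree[classroom] / total[classroom]:.0f}% agreeing")
```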


Student Perceptions

  • Top 5 correlations (ranks 33 and 34 shown for comparison):

Rank | Category  | Survey Statement
1    | Control   | Students in this class treat the teacher with respect
2    | Control   | My classmates behave the way my teacher wants them to
3    | Control   | Our class stays busy and doesn't waste time
4    | Challenge | In this class, we learn a lot every day
5    | Challenge | In this class, we learn to correct our mistakes
33   | Test Prep | I have learned a lot this year about [the state test]
34   | Test Prep | Getting ready for [the state test] takes a lot of time in our class


Preliminary Finding #2

the consequences are great





Random Assignment of Classes to Teachers (Year 2)

Random assignment is the gold standard for ensuring fair measures:

  • Teachers are assigned to a grade/course as usual.
  • Students are assigned to classes as usual.
  • Sections (the roster of students for a given course) are assigned to teachers by lottery (a toy lottery is sketched below).
  • This requires no action on the part of teachers.
  • Nothing changes in how teachers are assigned to courses.
  • Nothing changes in how students register for courses and are grouped into classes by principals.
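
A sketch of what such a lottery could look like: within each grade/course block, shuffle the sections and deal them out to that block's teachers. The block, teacher, and section names below are hypothetical.

```python
# Toy lottery: assign sections to teachers at random within a grade/course
# block. Block, teacher, and section names are hypothetical.
import random

random.seed(0)  # fixed seed so the example lottery is reproducible

blocks = {
    "Grade 6 Math": {
        "teachers": ["Teacher A", "Teacher B", "Teacher C"],
        "sections": ["Section 1", "Section 2", "Section 3"],
    },
}

for block, group in blocks.items():
    sections = list(group["sections"])
    random.shuffle(sections)                      # the lottery
    for teacher, section in zip(group["teachers"], sections):
        print(f"{block}: {section} -> {teacher}")
```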



Video Validation

Observation instruments and their developers:

Developer                          | Instrument
University of Virginia             | Classroom Assessment Scoring System (CLASS)
University of Michigan             | Mathematical Quality of Instruction (MQI)
Charlotte Danielson                | Framework for Teaching
Stanford University                | Quality Science Teaching (QST)
Pam Grossman                       | Protocol for Language Arts Teaching Observation (PLATO)
NBPTS                              | National Board for Professional Teaching Standards
National Math & Science Initiative | UTeach Observation Protocol (UTOP)

