
Measuring Effective Teaching




  1. UTOP Training: Mary Walker, University of Texas at Austin; Candace Walkington, Southern Methodist University

  2. Measuring Effective Teaching • What does effective teaching look like and how can we measure it? • Can classroom observers be trained to make key distinctions in effective teaching practices? • Can the skills involved with being an effective teacher be successfully trained through a teacher preparation program?

  3. Some Key Features of UTeach Philosophy • Organized, well-managed, on-task classroom • Attention to issues of diversity and access • Incorporating inquiry/investigative learning • Using technology for teaching and learning • Fluid and accurate communication of content • Fostering student-student collaboration • Formative assessment of student progress • Applications and inter-disciplinary connections • Critical practices of self-reflection • Facilitating classroom discussion and “student talk”

  4. The UTeach Program • Steady increase in number of students with strong STEM backgrounds going into teaching • Replicated at 28 universities in 13 states • 92% of graduates go into teaching, 82% remain 5 years later (compared to 65% nationally) www.nationalmathandscience.org

  5. Background of Project • Persistent requests to evaluate UTeach Graduates • UTeach boosts recruitment and retention, but are UTeach graduates effective teachers? • Look towards classroom observation

  6. Observational protocol development • RTOP [Reformed Teaching Observation Protocol] • http://physicsed.buffalostate.edu/AZTEC/RTOP/RTOP_full/about_RTOP.html • COP [Classroom Observation Protocol] • www.horizon-research.com/instruments/lsc/cop.pdf • http://www.horizon-research.com/instruments/hri_instrument.php?inst_id=14

  7. Description of UTOP • Full version has 32 indicators (teaching behaviors) in 4 sections • Classroom Environment • Lesson Structure • Implementation • Mathematics/Science Content • 1-5 scale, DK/NA options • Section Synthesis Ratings
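Before the walkthrough, it may help to picture the structure of a completed score sheet. Below is a minimal sketch in Python (not the official UTOP software; all class and field names are our own) of the shape described above: sections holding 1-5/DK/NA indicator ratings with evidence, plus a 1-5 section synthesis rating.

```python
from dataclasses import dataclass, field

VALID_RATINGS = {1, 2, 3, 4, 5, "DK", "NA"}  # DK = don't know, NA = not applicable

@dataclass
class Indicator:
    code: str       # e.g. "1.2"
    rating: object  # 1-5, "DK", or "NA"
    evidence: str   # supporting-evidence sentences

    def __post_init__(self):
        if self.rating not in VALID_RATINGS:
            raise ValueError(f"invalid rating for {self.code}: {self.rating!r}")

@dataclass
class Section:
    name: str                 # e.g. "Classroom Environment"
    indicators: list = field(default_factory=list)
    synthesis: object = None  # 1-5 section synthesis rating

env = Section(
    name="Classroom Environment",
    indicators=[
        Indicator("1.2", "NA", "No group work occurred during the lesson."),
        Indicator("1.4", 4, "Roughly 85% of students were on task throughout."),
    ],
    synthesis=4,
)
print(env.name, env.synthesis)
```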

  8. UTOP and Online Manual

  9. UTOP and Online Manual This indicator assesses the degree to which students have learned to be collegial, respectful, cooperative, and interactive when working in groups. Evidence of collegial working relationships among students includes collaborative discussions about topics relevant to the lesson and successful distribution of roles and responsibilities within each group… This indicator should be rated a 1 if there is group work during the lesson, but the group work is highly unproductive. This could include behavior where the majority of the groups are socializing, off-task, arguing, or ignoring each other, as well as regular instances of students copying and/or certain group members doing all of the work. This indicator should be rated a 2 if … Rating of 3 Example: The students were put into debate groups for this class period - one group would debate another group, while the rest of the student groups were in the audience. The groups worked together smoothly - the students were able to pick who was doing what part of the debate, coordinate their arguments, and split the time slots when necessary. The audience would also occasionally compare their notes during breaks…

  10. Pilot Study • Test UTOP on some of our graduates’ classrooms • Conducted 83 observations of: • UTeach Graduates (N=21) • Non-UTeach Graduates (N=15) • Novice teachers (most 0-3 years exp) • Math, science, and computer science classes

  11. Pilot Study

  12. Pilot Study • After starting out at similar levels, UTeachers gain higher UTOP scores over time • Teaching experience was a significant predictor of UTOP scores for the UTeach group (p < .05) • Noyce Scholars rated significantly higher on the UTOP than other groups (p < .01) • Key Question: Is the UTOP a valid and reliable instrument that measures important components of effective teaching?

  13. NMSI/MET Study • UTOP study conducted in partnership with the Gates Foundation’s Measures of Effective Teaching (MET) project and the National Math and Science Initiative • Examine reliability, consistency, and factor structure • Connect teaching behaviors on the UTOP to teacher value-added gains

  14. The MET Project • 3000 teachers from 7 school districts, 7 states • Various subjects (mathematics, English, science) and grade levels • Multiple measures of effectiveness (observations, value-added, student surveys, teacher exams) • Multiple video lessons of each teacher • Multiple classroom observation instruments • Charlotte Danielson’s FFT • CLASS protocol • MQI Rubric • UTOP

  15. NMSI/MET Study • 99 raters (math and science master teachers with LTF) scored 994 video lessons of 250 teachers using the UTOP • All lessons were grades 4-8 mathematics • One third of the videos were double-scored
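Double-scoring makes simple inter-rater checks possible. Below is a minimal sketch, using made-up 1-5 scores for eight double-scored lessons, of exact and adjacent (within one point) agreement between rater pairs; the MET analyses used more elaborate statistics, so this is illustrative only.

```python
import numpy as np

# Hypothetical 1-5 scores for eight double-scored lessons (illustrative only).
rater_a = np.array([3, 4, 2, 5, 3, 4, 3, 2])
rater_b = np.array([3, 3, 2, 4, 4, 4, 2, 2])

exact = np.mean(rater_a == rater_b)                  # identical scores
adjacent = np.mean(np.abs(rater_a - rater_b) <= 1)   # within one point
print(f"exact agreement:    {exact:.0%}")
print(f"adjacent agreement: {adjacent:.0%}")
```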

  16. Results • Most of the grades 4-8 mathematics video lessons from this national sample did not score highly on the UTOP • Many middle school math teachers taught problematic content; many used formulaic, keyword-type approaches • Raters identified problematic content issues in around one half of all lessons

  17. Results • Surface-level engagement was often seen, but with little emphasis on conceptual understanding • “Orderly but unambitious”

  18. Instrument Reliability • Can we consistently measure teaching effectiveness, beyond the biases of individual raters or the characteristics of particular lessons? • Goal: 60-80% of the variance in teacher scores on the instrument attributable to the stable characteristics of the individual teacher
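That goal can be read as a variance-components statement. The sketch below, on hypothetical scores for four teachers observed four times each, estimates the teacher share of variance with a basic one-way random-effects decomposition; the actual MET analyses were more sophisticated, so treat this as an illustration of the idea.

```python
import numpy as np

# Rows = teachers, columns = repeated lesson scores (hypothetical data).
scores = np.array([
    [3.2, 3.5, 3.1, 3.4],
    [2.1, 2.4, 2.0, 2.6],
    [4.0, 3.7, 4.2, 3.9],
    [2.8, 3.3, 2.5, 3.0],
])
n_teachers, n_obs = scores.shape

# One-way random-effects decomposition (standard ANOVA estimators).
ms_between = n_obs * np.var(scores.mean(axis=1), ddof=1)
ms_within = np.mean(np.var(scores, axis=1, ddof=1))
var_teacher = max((ms_between - ms_within) / n_obs, 0.0)

share = var_teacher / (var_teacher + ms_within)
print(f"teacher share of score variance: {share:.0%}")  # slide's goal: 60-80%
```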

  19. Instrument Reliability • Implementation Results: • 2 observations per year, with 2 raters present at each observation (UT Pilot study) • 4 observations per year, with 1 rater present at each observation (MET Study)
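The two designs trade raters for occasions. A Spearman-Brown style projection shows how averaging over observations and raters shrinks error variance; the variance components below are assumed values for illustration, not estimates from either study.

```python
def design_reliability(var_teacher, var_lesson, var_rater, n_obs, n_raters):
    """Reliability of a teacher's mean score under a simple additive model:
    lesson noise averages out over observations, rater noise over
    observations x raters."""
    error = var_lesson / n_obs + var_rater / (n_obs * n_raters)
    return var_teacher / (var_teacher + error)

vt, vl, vr = 0.50, 0.30, 0.20  # assumed variance components (illustrative)
print(design_reliability(vt, vl, vr, n_obs=2, n_raters=2))  # UT pilot design
print(design_reliability(vt, vl, vr, n_obs=4, n_raters=1))  # MET study design
```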

  20. Value-Added Correlations • Are the teaching behaviors measured on the UTOP associated with higher student learning gains, on standardized assessments and tests of conceptual understanding?
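The question is answered empirically by correlating teacher-level UTOP scores with value-added estimates. A minimal sketch of that check on hypothetical teacher-level data:

```python
import numpy as np

# Hypothetical teacher-level data (illustrative only).
utop_scores = np.array([2.8, 3.4, 2.1, 4.2, 3.0, 3.9])    # mean UTOP score
value_added = np.array([-0.1, 0.2, -0.3, 0.5, 0.0, 0.3])  # value-added gain

r = np.corrcoef(utop_scores, value_added)[0, 1]
print(f"UTOP vs. value-added: r = {r:.2f}")
```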

  21. Value-Added Correlations This graph is copied from the released 2012 MET Report

  22. Factor Analysis of UTOP • Cluster 1: Fostering Surface Engagement • On task & involved • Class management • Group work • Lesson organization • Cluster 2: Fostering Deep Conceptual Understanding • Inquiry/investigation • Higher-order questioning • Intellectual engagement • Cluster 3: Content Accuracy and Fluidity • Verbal & written accuracy/fluidity • Effective use of abstraction • Cluster 4: Making Content Connections • To real world (authentic) • To “big picture” • To history/current events
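For readers who want to see how such clusters are found, here is a hedged sketch: fit a four-factor model to a lessons-by-indicators matrix and inspect which indicators load together. The random matrix below merely stands in for real UTOP ratings, so the recovered factors are meaningless here; with real data, indicators would cluster along the lines of the four factors above.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Random stand-in for a real matrix of 994 lessons x 32 indicator ratings.
scores = rng.integers(1, 6, size=(994, 32)).astype(float)

fa = FactorAnalysis(n_components=4, random_state=0)
fa.fit(scores)

# Each row of components_ holds one factor's loadings on the 32 indicators;
# indicators that load together form clusters like those shown above.
for i, loadings in enumerate(fa.components_, start=1):
    top = np.argsort(-np.abs(loadings))[:3]
    print(f"factor {i}: highest-loading indicators {top.tolist()}")
```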

  23. Factor 1: Fostering Surface-Level Engagement • Classroom management • Majority “on task” • Group-work dynamic • Time management • Lesson organization • Appropriate resources • Issues of equity & access • Teacher critical of lesson

  24. Factor 2: Fostering Deep, Conceptual Understanding • Students generate ideas/conjectures • Students intellectually engaged • Students explore content • Use of higher-order questions • Use of inquiry/investigation

  25. Factor 3: Content Accuracy & Fluidity • Accurate written content information • Accurate & fluid verbal communication of content • Appropriate use of abstraction

  26. Factor 4: Content Connections • Connect content to the “real world” and other disciplines • Connect content to history & current events • Connect content to the “big picture” of the discipline

  27. Summary & Implications UTOP measures 4 factors of effective teaching UTOP has reasonable correlations with value-added – may better detect strong teachers Need multiple observations, multiple raters to conduct classroom observation Multiple measures of teaching effectiveness (value-added, observations, student surveys, teacher exams, etc.)

  28. Future Directions • Connect specific teaching behaviors to teacher value-added – what really matters? • Investigate why the UTOP might be more effective at identifying excellent teaching • Use the UTOP to compare classrooms at a project-based school (with UTeach graduates) to those at a traditional school in the same low-income school district

  29. Current Study • Manor ISD • Working at 3 high schools – Traditional, Project- and Problem-Based, and Accelerated Credit Academy • Baseline 2011-2012 at MHS and MNTHS • 6 observations per teacher • Interviews of teachers • Student focus groups • How to use the UTOP for Professional Development? • Expand pool size • Train teachers to use it for self-assessment and group-assessment • Facilitate Lesson Study in Professional Learning Groups • Reliability Audits with UTeach UTOP experts • Targeted PD sessions for summer 2013 with input from teachers • Funding to expand the model with new partners

  30. Today’s training agenda • Learn to use UTOP by using the tool • UTOP Video version • UTOP Manuals – Full and Video • UTOP Full version • Videos at http://www.timssvideo.com/ • 8th grade Mathematics lesson – US3 Exponents • 8th grade Science lesson – AU4 Energy Transfer • 8th grade Mathematics Lesson – US1 Graphing Linear Equations • High School Mathematics lesson – Lesson Lab GH 06111612.mov

  31. Next steps / feedback • What do you think? • Need for specific examples from different domains • How to recruit / support additional partners • Follow-up training along the lines of the Reliability Audits for the Manor study

  32. Rating Mathematics and Science Lessons Using the UTeach Observation Protocol (UTOP) First Day of Training

  33. The UTeach Observation Protocol • You will rate each indicator with a 1-5 rating, typed into a box in the Word document

  34. The UTeach Observation Protocol • You also type 1-5 sentences of supporting evidence into the “Evidence” box to justify each numerical rating.

  35. Rating Process • Supporting Evidence should be: • Specific – based on specific quotes and interactions from the video • No opinions! You must justify your rating based on evidence. • Somewhat brief – try to average 3-4 sentences • BUT detailed enough to demonstrate that you watched the video carefully and thoughtfully, or observed attentively and took good field notes in the classroom

  36. Rating Process • Supporting Evidence will be used: • By You: To determine your numerical ratings. Read your evidence, compare it to the rubric and examples, and then determine a rating. • By Your Group Members: To understand specific comments or events that occurred which convinced you to assign a particular score to a particular indicator • By UTOP Reliability Auditors: If you participate in future research studies using the UTOP, you will need to revalidate and recalibrate your skills • Co-observations with external UTOP evaluators • On-line modules with database recording your work

  37. Rating Process • Poor Supporting Evidence: • Is too brief • Is too vague • Is clearly biased or opinionated • Is not really related to the indicator’s intent • Does not provide adequate detail for good teacher feedback [Assuming the use is for ongoing professional development]

  38. Rating Process • Good Supporting Evidence: • Offers some specific examples/evidence, but not so many it becomes too long • Is concise, but not too brief to be uninformative • Is evidence - does not reflect an opinion • Is based on knowledge that all experienced teachers are expected to have of good practices • Is clearly related to your numerical score

  39. Rating Process • Several web resources are available to assist you in rating each indicator • Description: Overview of the intent of the indicator and general types of appropriate evidence (link).

  40. Rating Process • Several web resources are available to assist you in rating each indicator • Rubric: General rating guidelines for 1-5 ratings for that indicator (link).

  41. Rating Process • Several web resources are available to assist you in rating each indicator • Specific Rating Examples: Examples of evidence going with 1-5 ratings, for both mathematics and science (link).

  42. Rating Process • Synthesis Ratings • Each of the 4 sections concludes with a 1-5 synthesis rating. • Take into account all evidence and all ratings from that section, and choose which descriptor best characterizes the lesson.

  43. Rating Process • Synthesis Ratings • Not a numerical average – “human average” where you weight the relative importance of the different indicators • Type an “x” by your choice • No additional supporting evidence needed
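One way to picture the “human average” is as an implicitly weighted judgment over a section’s indicators, rounded to a 1-5 descriptor. The sketch below is only an analogy, with made-up ratings and weights; the protocol itself asks for a holistic judgment, not a computed score.

```python
# Made-up indicator ratings for one section ("NA" indicators dropped) and
# made-up importance weights reflecting one rater's judgment.
ratings = {"1.1": 4, "1.3": 3, "1.4": 5, "1.5": 4}
weights = {"1.1": 2.0, "1.3": 2.0, "1.4": 1.0, "1.5": 1.0}

weighted = sum(ratings[k] * weights[k] for k in ratings) / sum(weights.values())
synthesis = max(1, min(5, round(weighted)))
print(synthesis)  # -> 4
```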

  44. Rating Process • The final 2 sections of the UTOP are: • Lesson Description: Discuss the purpose of the lesson, the content covered, and the learning activities and pedagogical approaches used. (4-6 sentences) • Concluding Remarks: Value judgment of the lesson & relative importance of the ratings made - includes comments that did not fit into preceding sections. (3-4 sentences)

  45. Rating Process • The UTOP training manual contains all this information; it is available: • As a complete Word document (60 pages) • Online with a clickable table of contents and interface, at http://www.cwalkington.com/utop

  46. Indicators in 1: Classroom Environment 1.1 The classroom environment encouraged students to generate ideas, questions, conjectures, and/or propositions that reflected engagement or exploration with important mathematics or science concepts. • Make note of: • Instances during the lesson where you observe students generating ideas, questions, conjectures, or propositions. • Whether the teacher is actively encouraging these contributions • Special Considerations: • Keep in mind that giving a simple response to a direct teacher question is not really “generating an idea,” and that asking a simple clarification question is not what we have in mind as reflecting engagement with mathematics or science concepts.

  47. Indicators in 1: Classroom Environment 1.2 Interactions reflected collegial working relationships among students (e.g. students worked together productively and talked with each other about the lesson). • Make note of: • Whether students are talking to each other about topics relevant to the lesson • Whether students are having off-topic conversations in groups • Whether the teacher is making explicit moves to promote group work skills • Whether the student groups are effectively distributing roles • Whether all students in the group are involved in solving each problem • NA Rating: • Rate as NA if the lesson did not include group work or student-student interactions totaling more than 3 minutes over the entire lesson.

  48. Indicators in 1: Classroom Environment 1.3 Based on conversations, interactions with the teacher, and/or work samples, students were intellectually engaged with important ideas relevant to the focus of the lesson. • Make note of: • Whether students are making quality contributions during whole-class portions and small group portions of the lesson • Whether students are thinking about the concepts themselves, or just following along with what the teacher is telling them to do • Whether the students are giving contributions that clearly show they’re not thinking deeply about or understanding the content • Contributions that evidence intellectual engagement rather than surface level engagement

  49. Indicators in 1: Classroom Environment 1.4 The majority of students were on task throughout the class. • Make note of: • Whether students are actively participating (raising hands, asking questions) in whole class portions of lesson. • Whether students are doing assigned work during individual/group portions of lesson • Any off-task behavior evidence: socializing, head down, work for other class, etc. • Special considerations: • The benchmark for a “3” is 75% on task • You can only rate based on what you see/hear!
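The 75% benchmark suggests a rough banding of observed on-task fractions. Here is a minimal sketch anchored at 75% for a “3”; the other cut points are our assumptions for illustration and are not taken from the UTOP manual.

```python
def on_task_rating(fraction_on_task: float) -> int:
    """Map an observed on-task fraction to a 1-5 band; only the 75% -> 3
    anchor comes from the training, the other cut points are assumed."""
    bands = [(0.95, 5), (0.85, 4), (0.75, 3), (0.50, 2)]
    for cutoff, rating in bands:
        if fraction_on_task >= cutoff:
            return rating
    return 1

print(on_task_rating(0.80))  # -> 3
```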

  50. Indicators in 1: Classroom Environment 1.5 The teacher’s classroom management strategies enhanced the classroom environment. • Make note of: • Setting expectations & communicating objectives • Foreseeing problems • Managing behavior • Student seating & group selection • Managing student movement and participation norms • How the teacher ensures positive student-student interactions • Getting student attention and participation when needed • Special considerations: • Some teachers will manage every hint of off-task behavior, while others will let huge violations slip by. Look not only at what the teacher is saying, but at whether it’s appropriate based on what the students are doing.
