Interpreting Student Evaluations

Presentation Transcript


  1. Interpreting Student Evaluations Heather McGovern, Fall 2011

  2. Quick Advice • Attend to trends; disregard outliers (a rough illustration follows this slide) • Use whichever score is higher, adjusted or raw • Resist interpreting numbers with too much precision • Attend to the context of the institution (level of class, race of the faculty member, etc.) and of the person’s teaching • Remember that students cannot provide valid information on most of our aspects of excellence in teaching with our student rating instrument (or at all)
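
A minimal sketch of the "attend to trends, disregard outliers" advice, in Python. The semester scores here are hypothetical, and summarizing with a median is just one reasonable way to keep a single unusual term from dominating the picture; it is not a method the presentation itself prescribes.

    from statistics import mean, median

    # Hypothetical adjusted summary scores, one per semester (5-point scale).
    # One term (2.9) is an outlier relative to the rest.
    scores = [4.1, 4.3, 2.9, 4.2, 4.4, 4.3]

    print(f"mean across terms:   {mean(scores):.2f}")    # 4.03, dragged down by the outlier
    print(f"median across terms: {median(scores):.2f}")  # 4.25, closer to the typical term

The point is not the particular statistic but the habit: look at the run of semesters, not at any single score.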

  3. Myth: Being “similar” is bad • “Similar” means exactly that: similar. Because the scores are normed, the majority of faculty nationwide will fall into this band (a rough illustration follows this slide).
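
A minimal sketch of why a normed "similar" band is crowded. The band half-widths below are assumptions for illustration; IDEA's actual cutoffs may differ. Under a normal distribution, even a fairly narrow band around the mean holds a large share of everyone.

    from math import erf, sqrt

    def normal_cdf(z: float) -> float:
        """Cumulative probability of a standard normal at z."""
        return 0.5 * (1 + erf(z / sqrt(2)))

    # Share of a normed population within +/- half_width standard deviations.
    for half_width in (0.5, 0.67, 1.0):
        share = normal_cdf(half_width) - normal_cdf(-half_width)
        print(f"+/- {half_width} SD: {share:.0%}")
    # +/- 0.5 SD: 38%, +/- 0.67 SD: 50%, +/- 1.0 SD: 68%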

  4. FAQ: What does it mean when someone is adjusted down? • It means that the five factors IDEA uses indicate that students in the class were predisposed to give higher ratings: perhaps they were unusually motivated to take the class, perhaps they perceive themselves as unusually hard working, perhaps the class is small, or some combination of these and other factors applies. • What it does not mean: the teacher did something wrong. • Basic guidance from IDEA: use the higher score when evaluating an individual faculty member (a sketch of this rule follows this slide).
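
A minimal sketch of IDEA's "use the higher score" guidance. The function name and the example numbers are hypothetical; the rule itself comes straight from the slide.

    def score_for_evaluation(raw: float, adjusted: float) -> float:
        """Per IDEA's guidance, evaluate an individual faculty member
        on whichever summary score is higher, raw or adjusted."""
        return max(raw, adjusted)

    # Hypothetical case: a small, highly motivated class is adjusted down,
    # so the raw score is the one to use.
    print(score_for_evaluation(raw=4.4, adjusted=4.1))  # 4.4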

  5. FAQ: Why do some people have to use the small class form? • IDEA’s research indicates that fewer than 15 student responses lead to unreliable data. The union and administration agreed to move to the small class form for classes under 15 to avoid giving faculty what is essentially “junk” statistical data. IDEA reports the following median reliabilities by number of raters (a sketch of why reliability climbs with raters follows this slide): 10 raters, .69; 15 raters, .83; 20 raters, .83; 30 raters, .88; 40 raters, .91. Reliability ratings below .70 are highly suspect.
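
A minimal sketch of why reliability climbs with the number of raters. The Spearman-Brown prophecy formula is a standard psychometric relation; whether IDEA derived its medians this way is an assumption, but a single-rater reliability near .18 (fitted to the 10-rater figure) reproduces the published pattern approximately.

    def spearman_brown(single_rater_r: float, n_raters: int) -> float:
        """Reliability of the average of n_raters parallel ratings
        (Spearman-Brown prophecy formula)."""
        return n_raters * single_rater_r / (1 + (n_raters - 1) * single_rater_r)

    r1 = 0.18  # assumed single-rater reliability, fitted to the 10-rater median
    for n in (10, 15, 20, 30, 40):
        print(f"{n} raters: {spearman_brown(r1, n):.2f}")
    # 10 raters: 0.69, 15 raters: 0.77, 20 raters: 0.81, 30 raters: 0.87, 40 raters: 0.90

The shape of the curve, steep early gains that flatten out, is why a hard floor around 15 responses is a defensible cutoff.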

  6. FAQ: Why does page 3 not highlight an area in which faculty performed well? • Because IDEA’s research has not found a correlation between that item and the objectives selected. Bottom line: good teachers don’t have to use every pedagogical technique all the time (or ever); treat IDEA’s guidance as informed advice, not a mandate.

  7. Myth: Low scores can be blamed on a wrong CIP code • Probably not, unless you are really talking about your disciplinary-comparison scores (let’s see who in this room even knows where to find those on their IDEA report). • That said, correcting a CIP code can provide better disciplinary comparison data, which you can then consider and point others to.
