
Dot COM Meets Dot EDU: Giving Life to Online Teacher Evaluations






Presentation Transcript


  1. Dot COM Meets Dot EDU: Giving Life to Online Teacher Evaluations, by Kathy Gates, University of Mississippi

  2. Introduction • UM began a serious assessment of its teacher evaluation process in 1998. • UM moved to web-based delivery of results in Fall 1999 and web-based collection of responses in Fall 2003. • While not perfect, the resulting process is generally perceived as being successful, and our experiences hold valuable lessons for others.

  3. UM at a Glance • Located in Oxford, MS, about 80 miles south of Memphis, TN • About 14,500 students • Converted to SAP’s HR, Finance, and Plant Maintenance modules in 1999 & 2000 • North American pilot for SAP’s student system, Campus Management; went live in April 2003

  4. Agenda • Historical Perspective • Teacher Evaluations @ UM – Phase 1: Presentation of Results via Web • Teacher Evaluations @ UM – Phase 2: Collection of Responses via Web • Summary

  5. Traditionally a Controversial Topic • First used in North America in the mid-1920s • Subject of a huge body of literature: nearly 2,000 studies as of 1997, making it the most extensively studied area in higher education • About 30% of colleges and universities used some form of teacher evaluations 30 years ago; almost all do now • “Many researchers have concluded that the reliability and validity of student ratings are generally good … however, many controversies and unanswered questions persist.” From “Navigating Student Ratings of Instruction” by d’Apollonia and Abrami in the November 1997 issue of American Psychologist

  6. Faculty Concerns • Grade inflation: students’ evaluative ratings of instruction correlate positively with expected course grades. Are ratings influenced by grades? Are grades influenced by ratings? • Opportunity for retaliation by students • Students rate different academic fields differently • May lead to superficiality of course content • Ratings based on popularity, not teaching effectiveness; must keep students entertained. “Instructors who are skilled in the art of impression management are likely to receive high student ratings whether or not their students have adequately mastered course materials.” From “Instructor Personality and the Politics of the Classroom” by John C. Damron, Douglas College, Canada

  7. Faculty Concerns, cont. • “In my office, I have a folder that contains two items: (1) a copy of the Rutgers Student Instructional Rating Survey, and (2) a Customer Satisfaction form I once took away from a Holiday Inn. Making due allowances for the difference in goods and services provided, they are the same form, produced by the same logic of market forces and consumer relations.” • From “Why We Should Abolish Teaching Evaluations” by William C. Dowling

  8. More Faculty Concerns • “Student Teaching Evaluations: Inaccurate, Demeaning, Misused – Administrators love student teaching evaluations. Faculty need to understand the dangers of relying on these flawed instruments.” • Article by Gray and Bergmann in the September-October 2003 issue of Academe

  9. Past Chronicle Discussions • “We must recognize that student evaluations have led to grade inflation and the erosion of academic standards.” • “When student evaluations contribute to tenure, promotion and salary decisions, the gross injustice of the present system becomes intolerable.” • “Too many have been grievously wronged by the institutional discrimination that has resulted from the use of such ill-conceived and poorly utilized instruments as student evaluations.” • “Student evaluations must go, and they must go now.” • From “Student Evaluations of Their Professors Rarely Provide a Fair Measure of Teaching Ability” by Louis Goldman in the August 8, 1990 issue of the Chronicle of Higher Education

  10. Responses … • “Does he assume that all deans, chairpersons, and faculty-committee members are too stupid to read the evaluation results intelligently?” • “Or does Professor Goldman also think so little of students that the thousands of hours they spend in class do not entitle them to an opinion which is worthy of being taken seriously?” • “In 30 years in higher education, as a professor and an administrator, I have yet to see anyone ‘grievously wronged’ by the use of student forms about teaching.” • “After 20 years of undergraduate teaching with careful attention to a variety of evaluation instruments completed by my students, I am convinced that I have improved by working on the inadequacies identified by the students … I value my students’ assessment. In my experience, they have generally been more perceptive than I anticipate and more generous than I deserve.” • “If there is anything the research is agreed upon, it is that student ratings are statistically reliable.”

  11. Student Perspectives • “But valid student criticism of a professor is also a factor in my decision to take a class. If four out of six students tell me to avoid a professor and outline specific reasons why, then I will try to avoid the professor’s class.” • From “Letters to the Editor” in the April 25, 1997 issue of the Chronicle of Higher Education

  12. Faculty Views on Student Comments • “Yet, inescapably, life constantly evaluates us: We’re served divorce papers, we learn our cholesterol level, we get turned down for tenure. Or we get promoted, our poem gets published, we win in the over-40 age division in our local 10K race. We have to carry those moments with us, too, however difficult that may be. Too often the good in life seems temporary, while the bad stuff goes on and on. So it is with teaching evaluations. I’m sure I’ve forgotten many wonderful things that students have said about me over the years, but the zingers stick to me like burrs.” • From “Why I Stopped Reading My Student Evaluations” by Lucia Perillo in the July 7, 2000 issue of the Chronicle of Higher Education

  13. Responses … • “Not reading the forms is succumbing to the misguided belief that there is nothing to be learned from … those we say we respect, our students.” • “Anonymous student evaluations of instruction are a failed experiment. They have no credibility with anyone except those who devise them … These types of evaluations should be discarded immediately.”

  14. Student Comments, cont. • “Granted spelling and grammar are often woeful. In my 16 years as department chairperson, I had to assess remarks like ‘he has no patients’ and, in the case of a colleague given to cuss words, ‘he used profound language,’ presumably teaching his ‘coarse.’” • “Most students are fair-minded, and their kind comments generally make for gratifying reading.” • “I am a true believer that evaluations generally give an accurate collective assessment of a course by the participants – our customers, if you will.” • “I still believe that we could learn a lot from each other.” • From “What Students Can Teach Professors: Reading Between the Lines of Evaluation” by Douglas Hilt in the March 16, 2001 issue of the Chronicle of Higher Education

  15. Student Comments, cont. • “Few of us will ever get the sexist but enthusiastic evaluation that one University of Illinois instructor received: ‘My French teacher,’ it read, ‘is drop-dead gorgeous, and I think I’m in love with her.’” • “Regardless of how colleges evaluate teaching, everyone’s concerned about evaluation.” • From “Student Evaluations Deconstructed” by Joel J. Gold in the September 12, 1997 issue of the Chronicle of Higher Education

  16. Usefulness of Student Comments • “Anecdotal comments are at least as useful as numerical ratings, so the evaluation should be constructed to prompt appropriate comments from students.” • From the “Report of the Joint Commission on Evaluation of Instruction” at Kent State University

  17. Web-Based Evaluations • Benefits: timely feedback of evaluation results; flexibility in survey design and development; convenience for students; increase in written comments; data warehousing • Challenges: response rates; culture change; concerns about privacy and confidentiality; use of e-mail for participation announcements • From “Web-based Student Evaluation of Instruction: Promises and Pitfalls” by McGourty, Scoles, and Thorpe, presented at the 42nd Annual Forum of the Association for Institutional Research, June 2002

  18. Web-Based Evaluations, cont. • “For this institution, female students were significantly more likely to respond to the web-based course evaluation survey than men.” • “In conclusion, the results of this study suggest that concerns regarding low response rates and the potential for non-response bias in web-based course evaluation methods may not be warranted …” • From “Online Student Evaluation of Instruction: An Investigation of Non-Response Bias” by Stephen Thorpe, presented at the 42nd Annual Forum of the Association for Institutional Research, June 2002

  19. Note the “hotness” factor.

  20. Did you know that your college/university already has web-based teacher evaluations?

  21. Teacher Evaluations @ UM • Purposes: used to improve teaching (formative); used in personnel decisions (summative); serves as an opportunity for students to provide feedback; used as a student consumer guide • Goal: make evaluations useful to three constituencies (faculty, students, administrators)

  22. Phase 1: Presentation of Results via Web • Deployed in Fall 1999 • Developed in cooperation with a Faculty Committee on Evaluation of Instruction • Featured graphical display of results with reference groups • Also featured VIP Reports • Access limited to the “olemiss.edu” domain • Instructors given the opportunity to withhold results from public view
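The slides do not say how the “olemiss.edu” restriction or the instructor opt-out were enforced. The following is a minimal Python sketch of one way a results page might gate access, assuming a reverse-DNS check on the client address and a hypothetical opt-out set; none of these names reflect UM's actual implementation.

```python
import socket

ALLOWED_DOMAIN = ".olemiss.edu"  # campus-only access, per the Phase 1 design

def client_is_on_campus(remote_addr: str) -> bool:
    """Reverse-resolve the client's IP and check that its hostname ends in the
    allowed domain (one possible strategy; UM's actual mechanism isn't described)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(remote_addr)
    except OSError:
        return False
    return hostname.lower().endswith(ALLOWED_DOMAIN)

def results_visible(instructor_id: str, withheld: set[str]) -> bool:
    """Instructors could withhold results from public view; `withheld` stands in
    for whatever opt-out list the system kept."""
    return instructor_id not in withheld

# Hypothetical usage inside a request handler:
# if client_is_on_campus(os.environ["REMOTE_ADDR"]) and results_visible(iid, withheld_set):
#     render_results(iid)
```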

  23. Print-friendly format. Note the response rate. Note the reference group. Note the user cautions.

  24. [Screenshot: this class plotted against the lower-division Liberal Arts reference group]

  25. VIP Reports • Summary reports for deans, chairs, and other administrators • Report types: comparisons by school/college; comparisons by department; sorted responses by question by department; course evaluations
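To make the VIP report types concrete, here is a small sketch of a “comparisons by department” query. The schema is not given in the slides, so the responses table and its dept/question/score columns are hypothetical, and an in-memory SQLite database stands in for the production database.

```python
import sqlite3

# Hypothetical schema and sample data standing in for the real evaluation store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE responses (dept TEXT, question TEXT, score INTEGER)")
conn.executemany(
    "INSERT INTO responses VALUES (?, ?, ?)",
    [("CSCI", "Q1", 5), ("CSCI", "Q1", 4), ("MATH", "Q1", 3), ("MATH", "Q1", 4)],
)

# Average score per department for one question, sorted high to low --
# the shape of a "sorted responses by question by department" report.
rows = conn.execute(
    """SELECT dept, ROUND(AVG(score), 2) AS avg_score, COUNT(*) AS n
       FROM responses
       WHERE question = 'Q1'
       GROUP BY dept
       ORDER BY avg_score DESC"""
).fetchall()

for dept, avg_score, n in rows:
    print(f"{dept}: {avg_score} (n={n})")
```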

  26. [Screenshot: lower-division Liberal Arts vs. lower-division UM reference groups]

  27. Technology Choices • Scantron forms for collection of responses • Forms printed from legacy mainframe system • Scanned results uploaded into MySQL database • CGI scripts to display results • Graphs generated in real time using the gd GIF library • See http://www.boutell.com/gd/
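As a rough sketch of the Phase 1 display pipeline, the snippet below renders a small per-question bar chart and returns it as GIF bytes, the way a CGI script might stream a real-time graph. Pillow stands in here for the C gd library named on the slide, the scores dictionary stands in for values that would come from the MySQL database, and the layout constants are illustrative only.

```python
import io
from PIL import Image, ImageDraw  # Pillow used as a stand-in for the C gd library

def bar_chart_gif(scores: dict[str, float], width: int = 320, height: int = 160) -> bytes:
    """Render a tiny per-question bar chart (1-5 scale) and return GIF bytes."""
    img = Image.new("RGB", (width, height), "white")
    draw = ImageDraw.Draw(img)
    bar_w = width // max(len(scores), 1)
    for i, (label, score) in enumerate(scores.items()):
        bar_h = int((score / 5.0) * (height - 20))      # leave room for the label row
        x0 = i * bar_w + 4
        draw.rectangle([x0, height - bar_h, x0 + bar_w - 8, height - 1], fill="steelblue")
        draw.text((x0, 2), f"{label} {score:.2f}", fill="black")
    buf = io.BytesIO()
    img.save(buf, format="GIF")
    return buf.getvalue()

# In a CGI context the script would print a "Content-Type: image/gif" header
# and write these bytes to stdout; here we simply save a demo file.
if __name__ == "__main__":
    with open("demo.gif", "wb") as f:
        f.write(bar_chart_gif({"Q1": 4.2, "Q2": 3.7, "Q3": 4.8}))
```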

  28. Phase 1 Results • Advantages: improved information access for all constituencies (previously, students only had access through paper print-outs in the library) • Disadvantages: collection of results costly, error-prone, and labor-intensive; slow turn-around time; scanning equipment failures; takes up class time; no anonymity for written responses • … but then scandal strikes the teacher evaluation process @ UM …

  29. Scandal leads to progress!

  30. Phase 2: Collection of Responses via Web • Deployed in Fall 2003 • Timing was influenced by the migration to SAP’s Campus Management student system in Spring 2003 and the retirement of the legacy mainframe in December 2003 • Focus groups with students and instructors in Summer 2003

  31. Fall 2003 Features • WebID authentication • Back-end database converted from MySQL to Oracle for better performance • Student comments in public reporting interface • New “Amazon-style” question: “What do you want other students to know about your experience in this course?” • Overall five-star rating • Student comments in faculty/VIP reporting interface • Better support for courses with multiple instructors • “Last chance to evaluate” opportunity before viewing final grades
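To illustrate the overall five-star rating alongside the multi-instructor support, here is a small sketch that averages 1-5 star responses per (course, instructor) pair to two decimal places. The response tuples and the grouping scheme are assumptions, not the actual Fall 2003 schema.

```python
from collections import defaultdict
from statistics import mean

# Illustrative responses: (course, instructor, stars). A course with two
# instructors gets a separate rating for each, as the Fall 2003 release intended.
responses = [
    ("CSCI 111", "Instructor A", 5),
    ("CSCI 111", "Instructor A", 4),
    ("CSCI 111", "Instructor B", 3),
]

by_instructor = defaultdict(list)
for course, instructor, stars in responses:
    by_instructor[(course, instructor)].append(stars)

for (course, instructor), stars in sorted(by_instructor.items()):
    print(f"{course} / {instructor}: {round(mean(stars), 2):.2f} stars ({len(stars)} responses)")
```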

  32. Go to comments. Stars filled dynamically to 1/100th point accuracy, thanks to a UM graduate intern!
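One common way to fill stars to 1/100th of a point is to overlay a full-star strip image on an empty one, cropped to a width proportional to the rating. The sketch below shows that width calculation; the 100-pixel strip width is an assumption, not the actual UM graphic.

```python
STAR_STRIP_WIDTH_PX = 100   # hypothetical width of the full 5-star strip image
MAX_STARS = 5.0

def fill_width(rating: float) -> int:
    """Pixel width of the filled overlay for a rating such as 4.37."""
    rating = max(0.0, min(MAX_STARS, round(rating, 2)))  # clamp, keep 1/100th precision
    return round(STAR_STRIP_WIDTH_PX * rating / MAX_STARS)

print(fill_width(4.37))  # -> 87 of 100 pixels filled
```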

  33. Must study and read in order to pass!

  34. Fall 2003 Concerns • Reduced response rate when going from the paper process to on-line: dropped to about 30% • Loss of the “middle” sample set due to voluntary participation: will results be skewed if only those with strong feelings one way or the other participate? • The time frame for capturing responses has changed – will this affect the outcome?

  35. Changes for Spring 2004 • Participation incentives: students who completed 100% of their Spring 2004 evaluations register one day early in Fall 2004 priority registration; for the first seven days of grades viewing, students who had completed at least 50% of their evaluations had the option to go straight to grades viewing (afterwards, all had this opportunity) • New features: tool for department chairs to mark courses as exempt/non-exempt; time stamps on responses; variable questions by course (pilot project)
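For reference, the incentive rules on this slide reduce to a couple of small checks. The function names and date handling below are assumptions; only the 100% and 50% thresholds and the seven-day window come from the slide.

```python
from datetime import date

def completion_rate(completed: int, assigned: int) -> float:
    """Fraction of a student's assigned evaluations that have been submitted."""
    return completed / assigned if assigned else 1.0

def early_registration_eligible(completed: int, assigned: int) -> bool:
    """100% of Spring 2004 evaluations completed -> register one day early in Fall 2004."""
    return completion_rate(completed, assigned) >= 1.0

def straight_to_grades_visible(completed: int, assigned: int,
                               today: date, grades_open: date) -> bool:
    """Show the 'straight to final grades' link if at least half of the student's
    evaluations are done, or once the first seven days of grade viewing have passed."""
    if (today - grades_open).days >= 7:
        return True
    return completion_rate(completed, assigned) >= 0.5

print(early_registration_eligible(5, 5))                                        # True
print(straight_to_grades_visible(2, 5, date(2004, 5, 12), date(2004, 5, 10)))   # False
```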

  36. Link to go straight to final grades made visible or suppressed based on outstanding evaluations.

  37. Wins • The rate of participation increased to about 58-60%: 29,146 responses in Spring 2004 with the on-line process vs. 25,578 in Spring 2001 with the paper process. (Remember that the Spring 2002 evaluations were shredded, so that comparison group was lost!) • The average scores remained similar to those for previous semesters. • Very little bias is introduced by the online process. • The “live” comments feature seems to have value for both students and instructors.
