
Development and Deployment of a Web-Based Course Evaluation System

This paper discusses the development and deployment of a web-based course evaluation system that aims to improve accountability and efficiency in gathering student feedback. It addresses the limitations of the paper-based system it replaces and highlights the goals and benefits of the web-based approach. It also explores the challenges faced in deploying the system, including student, administration, faculty, and union concerns. Overall, the web-based system offers a more convenient and effective way to collect and analyze course evaluations.

Presentation Transcript


  1. Development and Deployment of a Web-Based Course Evaluation System
  Jesse Heines and David Martin
  Dept. of Computer Science, Univ. of Massachusetts Lowell
  Miami, Florida, May 26, 2005

  2. The All-Important Subtitle
  Trying to satisfy ...
  • the Students
  • the Administration
  • the Faculty
  • and the Union
  (presented in a slightly different order from that listed in the paper)

  5. Paper-Based System Reality
  • Distributed and filled out in classrooms
    • Thus, virtually all students present that day fill them out
    • However, absentees never fill them out

  6. Paper-Based System Reality
  • Distributed and filled out in classrooms
  • Collected but not really analyzed
    • At best, Chairs “look them over” to get a “general feel” for students’ reactions
    • Professors simply don’t bother with them
      • lack of interest and/or perceived importance
      • simple inconvenience of having to go get them and wade through the raw forms

  7. Paper-Based System Reality
  • Distributed and filled out in classrooms
  • Collected but not really analyzed
  • Lose valuable free-form student input because those comments are often ...
    • downright illegible
    • so poorly written that it’s simply too difficult to try to make sense of them

  8. Paper-Based System Reality
  • Distributed and filled out in classrooms
  • Collected but not really analyzed
  • Lose valuable free-form student input
    • However, these comments have the greatest potential to provide real insight into the classroom experience

  9. Paper-Based System Reality
  • Distributed and filled out in classrooms
  • Collected but not really analyzed
  • Lose valuable free-form student input
    • However, these comments have the greatest potential to provide real insight
  • Bottom Line #1: The paper-based system pays little more than lip service to the cry for accountability in college teaching

  10. Paper-Based System Reality
  • Bottom Line #2: We’re all already being evaluated online whether we like it or not ...

  11. Web-Based System Goals
  • Collect data in electronic format
    • Easier and faster to tabulate
    • More accurate analysis
    • Possibility of generating summary reports

  12. Web-Based System Goals
  • Collect data in electronic format
    • Easier and faster to tabulate
    • More accurate analysis
    • Possibility of generating summary reports
  • Retrieve legible free-form responses
  • Allow all students to complete evaluations anytime, anywhere, at their leisure, and even if they miss the class in which the evaluations are distributed
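To make the tabulation goal concrete, here is a minimal sketch of how electronically collected ratings might be summarized per question. The record layout and function names are illustrative assumptions; the presentation does not describe the system's actual schema or code.

```python
from statistics import mean

# Hypothetical record layout for electronically collected ratings;
# the actual system's data model is not described in this presentation.
responses = [
    {"question_id": "Q1", "rating": 5},
    {"question_id": "Q1", "rating": 4},
    {"question_id": "Q2", "rating": 3},
]

def summarize(responses):
    """Group ratings by question and compute simple summary statistics,
    illustrating the 'easier and faster to tabulate' goal."""
    by_question = {}
    for r in responses:
        by_question.setdefault(r["question_id"], []).append(r["rating"])
    return {qid: {"n": len(rs), "mean": round(mean(rs), 2)}
            for qid, rs in by_question.items()}

print(summarize(responses))
# -> {'Q1': {'n': 2, 'mean': 4.5}, 'Q2': {'n': 1, 'mean': 3.0}}
```

From a per-question summary like this, generating the report slides a Chair would actually read becomes a formatting exercise rather than a manual wade through raw forms.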

  13. What We Thought
  If we build it, they will come ...
  ... but we were very wrong!

  14. Student Issues
  • Maintain anonymity
  • Ease of use
  • Speed of use

  15. Student Issues
  • Maintain anonymity
  • Ease of use
  • Speed of use
  We guessed wrong on the relative priorities of these issues.

  16. Student Issues
  • Our main concern:
    • Prevent students from “stuffing the ballot box”
    • One Student = One Survey Submission

  17. Student Issues
  • Our main concern:
    • Prevent students from “stuffing the ballot box”
    • One Student = One Survey Submission
  • Major concern that appeared after the system was deployed:
    • Simply getting students to participate
    • There appeared to be a great deal of apathy, particularly in non-technical courses
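One common way to reconcile “One Student = One Survey Submission” with anonymity is to record a keyed hash of (student, survey) separately from the answers, so the system knows *that* a student submitted without being able to link *what* they submitted. The sketch below shows that general technique; it is an assumption for illustration, not the login scheme the system actually used (whose evolution the following slides show).

```python
import hashlib

submitted = set()   # records only *that* a student has submitted
responses = []      # stores answers with no student identifier attached

def submission_token(student_id: str, survey_id: str, secret: str) -> str:
    """Derive a stable, non-reversible token; the server-side secret keeps
    anyone from recomputing tokens from a list of known student IDs."""
    return hashlib.sha256(f"{secret}:{student_id}:{survey_id}".encode()).hexdigest()

def submit(student_id: str, survey_id: str, answers: dict,
           secret: str = "replace-with-server-secret") -> bool:
    token = submission_token(student_id, survey_id, secret)
    if token in submitted:
        return False            # duplicate: the ballot box stays unstuffed
    submitted.add(token)
    responses.append({"survey_id": survey_id, "answers": answers})  # anonymous
    return True
```

Because the token table and the response table share no key, even a database administrator cannot pair a student with their comments, which addresses the anonymity concern while still blocking repeat submissions.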

  18. Student Login Evolution (Fall 2003)

  19. Student Login Evolution

  20. Student Login Evolution

  21. Student Login Evolution

  22. Administration Issues
  • System quality and integrity
  • “Buy in” from the deans
  • But the real issue was ... dealing with the faculty union

  23. Faculty Issue #1
  • Control of which courses are evaluated
  • Contract wording: “The evaluation will be conducted in a single section of one course per semester. ... At the faculty member’s option, student evaluations may be conducted in additional sections or courses.”

  24. Union Issue #1
  • In 2004, all surveys were “turned on” by default, that is, they were all accessible to students on the Web
  • This was a breach of the contract clause stating that “evaluation will be conducted in a single section of one course”
  • In 2005, the default is inaccessible
    • Use of the system thus became voluntary
    • As of May 20, 2005 (end of final exams), 95 professors (25% of the faculty) in 40 departments had made 244 course surveys accessible to students
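The 2004-to-2005 remedy amounts to flipping a default: surveys now start inaccessible, and the instructor must opt in. A minimal sketch of that policy, with class and function names that are assumptions rather than the system's actual code:

```python
from dataclasses import dataclass

@dataclass
class Survey:
    course: str
    section: str
    accessible: bool = False   # 2005 policy: surveys start inaccessible

def enable_survey(survey: Survey, requester: str, instructor: str) -> None:
    """Voluntary opt-in: only the course's own instructor may turn a survey on,
    keeping use of the system within the contract's single-section language."""
    if requester != instructor:
        raise PermissionError("only the instructor may enable this survey")
    survey.accessible = True
```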

  25. Faculty Menu

  26. Faculty Issue #2
  • Control of what questions are asked
  • Contract wording: “Individual faculty members in conjunction with the Chairs/Heads and/or the personnel committees of academic departments will develop evaluation instruments which satisfy standards of reliability and validity.”

  27. Union Issue #2
  • In 2004, deans could set questions to be asked on all surveys for their college
  • This was a breach of the contract clause stating that faculty would develop questions “in conjunction with the Chairs/Heads and/or department personnel committees”
  • In 2005, all college-level questions were moved to the department level so that only Chairs can specify required questions
    • Deans now have essentially no access to the system unless they are teaching a course themselves or are serving as the acting chair of a department
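The 2005 question-control policy can be read as a scoping rule: required questions live at the department level, only Chairs may create them, and deans have no hook into the system at all. A sketch of that rule under hypothetical names:

```python
def add_question(questions: list, text: str, scope: str, role: str) -> None:
    """Enforce the 2005 policy: department-level (required) questions may be
    added only by Chairs; individual faculty add questions to their own
    surveys; deans have no access to the question editor at all."""
    if role == "dean":
        raise PermissionError("deans have no access to the question editor")
    if scope == "department" and role != "chair":
        raise PermissionError("only Chairs can specify required department questions")
    questions.append({"text": text, "scope": scope,
                      "required": scope == "department"})
```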

  28. Faculty Menu

  29. Faculty Question Editor

  30. Faculty Question Editor

  31. Faculty Add Question Form

  32. Survey as Seen by Students

  33. Faculty Issue #3
  • Control of who sees the results
  • Contract wording: “Student evaluations shall remain at the department level. At the faculty member’s option, the faculty member may submit student evaluations or a summary of their results for consideration by various promotion and tenure review committees. The faculty member shall become the sole custodian of these student evaluations at the end of every three academic years and shall have the exclusive authority and responsibility to maintain or destroy them.”
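Translated into an access check, the clause says: the faculty member always sees their own results, may optionally release them (e.g., to a promotion and tenure committee), and otherwise the results stay within the department. A sketch under those assumptions; the User model and its fields are hypothetical, not the system's actual data model:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    department: str
    is_chair: bool = False

def can_view_results(viewer: User, owner: User,
                     released_by_owner: bool = False) -> bool:
    """Evaluations 'remain at the department level': the owner always sees
    them, release to committees is the owner's option, and otherwise only
    the owner's own department (here, its Chair) may look."""
    if viewer.name == owner.name or released_by_owner:
        return True
    return viewer.is_chair and viewer.department == owner.department
```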

  34. Results as Seen by Faculty

  35. Union Issue #3
  • Data was collected without faculty consent
  • This was a breach of the contract clause stating that “student evaluations shall remain at the department level”
  • All survey response data for the Fall 2004 semester were deleted on February 15, 2005, unless the faculty member explicitly asked that it be kept
  • What’s going to happen with this semester’s data has not yet been determined
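The remedy described here is a retention rule: collected responses are purged unless the faculty member explicitly asked to keep them. A sketch of such a purge pass, with an assumed record layout and flag name:

```python
def purge_responses(surveys: list) -> None:
    """Delete collected responses unless the owner opted to keep them,
    mirroring the February 15, 2005 deletion of Fall 2004 data.
    The 'keep_requested' flag is an assumed field name."""
    for s in surveys:
        if not s.get("keep_requested", False):
            s["responses"] = []   # data removed; the survey definition remains
```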

  36. Faculty Menu

  37. Lessons Learned/Confirmed
  • No matter what you do, there will be those who object
    • You must remain open-minded and flexible
  • Practice good software engineering so that the software can be easily modified
  • It’s really worth it to work with the many power factions to garner support
  • Every system needs a “champion”
  • Be prepared to spend a huge amount of time on system support

  38. Support, Support, Support

  39. Thank You
  Jesse M. Heines, Ed.D.
  David M. Martin, Ph.D.
  Dept. of Computer Science, Univ. of Massachusetts Lowell
  {heines,dm}@cs.uml.edu
  http://www.cs.uml.edu/{~heines,~dm}
