
Conversion of Faculty Evaluations to an On-Line Format




  1. Conversion of Faculty Evaluations to an On-Line Format
  Catherine Hackett Renner, SUNY Geneseo
  Larry Piegza, Gap Technologies, Inc., OnlineCourseEvaluations.com

  2. The Paper Process
  • ~22,000-25,000 forms per term were hand-sorted into course sections
  • Distributed to faculty during the last 2 weeks of classes
  • Faculty returned them to the IR Office
  • Forms were separated, scanned, then hand-sorted back into course sections
  • Invariably, some evaluations were turned in late
  • Faculty received the results back in mid-February

  3. Complaints with the Process
  • Faculty
    • Time delay in getting results back
    • Late returns were not processed
  • Students
    • Anonymity
    • Detail in comments: classroom time did not allow for thoughtful, detailed comments
    • Convenience
    • Would like to see results online

  4. Conversion to an On-line Format
  • Internal conversion was not feasible
    • Would have required migration to a new system
    • Internal survey software was not secure enough for this process
  • Decision was made to outsource
  • Constraints on the company
    • Had to use our own instrument
    • Had to conform to the levels of access dictated by the Faculty Senate

  5. Company Choice
  • Best met our needs
    • Could take our current evaluation system and upload it
    • Could reproduce our levels of access
  • Very customer-oriented

  6. Spring 2006 Pilot
  • Students enrolled in Psychology classes and Astronomy Laboratories provided their Student Opinion of Faculty Instruction (SOFI) ratings in an on-line format rather than the in-class paper format
  • 51 course sections
  • 1,199 individual students providing 1,806 evaluations
  • These 1,199 students were also able to compare the on-line version to the in-class paper format (used in their other classes)

  7. Getting the Word Out
  • Students were informed of the pilot by their course instructors at the beginning of the semester
  • One week prior to the SOFI evaluation period, informational flyers were posted in the classrooms of the 51 sections
  • An article on the process was published in the student newspaper
  • The Student Government Association endorsed the process and helped disseminate information
  • On the first evaluation day, faculty were given handouts to distribute to their students explaining the process
  • The evaluation period ran from April 17th to May 3rd

  8. Pilot Procedure
  • Students received an e-mail with their login ID and password
  • The e-mail contained a direct link to the login webpage
  • E-mail reminders were sent every 48 hours
  • Reminders stopped once all evaluations were completed
  • Faculty received e-mails requesting that they remind students of the evaluation period
  • Within the e-mail, faculty also received the response rate for each of their courses

  9. Pilot Results
  • The response rate for the pilot was 76%!
  • This was consistent with the response rate for the in-class SOFI (70-73%)
  • Some bumps in the road
    • All occurred within the first 2 days
    • All were minor
    • All were resolved in less than 24 hours

  10. Student Satisfaction Survey
  • Seven survey questions designed to assess the following:
    • Anonymity
    • Detail of comments
    • Convenience
    • Choice
  • The final question was designed to be the “litmus test” of the pilot: it asked students which format they would prefer to use if given the choice

  11. Anonymity

  12. Detail of Comments

  13. Convenience

  14. Results

  15. Results

  16. Choice

  17. Decision
  • We went to full implementation in Fall 2006
  • 80% response rate
  • A smooth run

  18. What SUNY Geneseo Did Well
  • Created a login link from their web portal
  • Erased student confidentiality concerns by selecting an off-site vendor
  • Communicated with their teachers
  • Shared results with their students

  19. What SUNY Geneseo Did Well
  • Sent out regular e-mails to their students
  • Had teachers hand out directions
  • Ran an ad in the school newspaper
  • Mobilized their student government

  20. Top 10 Things Not To Do When Going Online:
  10) Have a one-week evaluation window for your first semester
  9) Decide on a vendor two weeks before your evaluation starts
  8) Spam your students
  7) Change your course of action based on whatever a teacher happens to suggest at the moment

  21. Top 10 Things Not To Do When Going Online:
  6) Run concurrently with paper

  22. Top 10 Things Not To Do When Going Online:
  5) Buy a system that doesn't let your students click an "I dropped this class" button
  4) Don't communicate with your teachers
  3) Ignore your vendor’s advice / go it alone
  2) Ask 100 questions on each survey

  23. Top 10 Things Not To Do When Going Online:
  1) Send a two-page email lecturing your students on their civic responsibility to fill out course evaluations

  24. 8 Must Haves 1) Patience to change your school’s culture

  25. 8 Must Haves
  2) Eliminate common questions for team-taught courses
  3) Prizes to give away to students
  4) Share results with students
  5) Allow teachers to email their students
  6) Have a contest

  26. 8 Must Haves
  7) Follow-up question technology
  Example: “Did the instructor speak audibly and clearly?” can trigger “What could the instructor have done to speak more clearly?”
  • Get a microphone
  • Not face the blackboard
  • Speak slower
  • Speak faster

  27. 8 Must Haves
  8) Dropped-class survey
  “What was the primary reason why you dropped this class?”
  • Course related
  • Instructor related
  • Financial issue
  • Scheduling issue
  • Something about me
  • Other
