Online Course Evaluations: Lessons Learned

With a cast of thousands, including: Susan Monsen, W. Ken Woo, Carrie Mahan Groce, & Wayne Miller

Yale Law Experience
  • Course evaluations were run by student representatives
  • Introduced the first online system in 2001
  • Changed the system twice and introduced incentives
  • For Spring 2005, achieved a 90% response rate
YLS OCE Version 1
  • First online course evaluation (OCE), Fall 2001–Spring 2003
  • Homegrown web application with 18 questions
  • System did not scale for in-class completion
  • General email reminders sent to all students
  • No incentives
  • Response rate less than 20%
Back to Paper

Returned to paper after three semesters of use. Reasons:
  • Low response rate
  • Wanted an easier-to-use interface for completing and viewing results
  • Wanted the ability to add incentives

OCE Version 2: Design
  • Designed with input from student representatives and faculty
  • Modeled after Yale College system
  • Reduced the number of questions to 8
  • Added a comment question
  • Students with evaluations to complete received a weekly email reminder (see the sketch below)
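
As a rough illustration of that reminder loop, here is a minimal sketch, assuming outstanding evaluations are tracked in a small database and mail goes through a local relay. The table layout, the smtp.example.edu host, and the addresses are hypothetical, not the system YLS actually ran.

```python
# Hypothetical weekly reminder job. Table names, columns, and the SMTP
# relay are assumptions, not the actual YLS implementation.
import smtplib
import sqlite3
from email.message import EmailMessage

SMTP_HOST = "smtp.example.edu"  # assumed local mail relay

def students_with_pending_evals(db_path="oce.db"):
    """Return (email, pending_count) pairs for students with evaluations left."""
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(
            """SELECT s.email, COUNT(*)
               FROM enrollments e JOIN students s ON s.id = e.student_id
               WHERE e.evaluated = 0
               GROUP BY s.email"""
        ).fetchall()
    finally:
        conn.close()

def send_weekly_reminders():
    """Email each student who still has evaluations to complete."""
    with smtplib.SMTP(SMTP_HOST) as smtp:
        for email, count in students_with_pending_evals():
            msg = EmailMessage()
            msg["From"] = "registrar@law.example.edu"
            msg["To"] = email
            msg["Subject"] = "Course evaluation reminder"
            msg.set_content(f"You have {count} course evaluation(s) to complete.")
            smtp.send_message(msg)

if __name__ == "__main__":
    send_weekly_reminders()  # schedule weekly, e.g. via cron
```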
Incentives
  • Tested Class Time for Completion
    • Worked for small to midsize classes
    • Response rate about 90%
    • Load testing indicated up to 75 simultaneous users.
  • Introduced Grade Blocking
    • Students see an “*” instead of a grade for classes they have not evaluated (a minimal sketch follows).
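
A minimal sketch of that grade-blocking rule, assuming a simple enrollment record with an evaluated flag; the Enrollment type and sample data are invented for illustration.

```python
# Hypothetical grade-blocking rule: mask the grade with "*" for any class
# the student has not yet evaluated. The record layout is an assumption.
from dataclasses import dataclass

@dataclass
class Enrollment:
    course: str
    grade: str
    evaluated: bool

def displayed_grade(e: Enrollment) -> str:
    """Show the real grade only once the class has been evaluated."""
    return e.grade if e.evaluated else "*"

transcript = [
    Enrollment("Contracts", "A-", evaluated=True),
    Enrollment("Torts", "B+", evaluated=False),
]
for e in transcript:
    print(f"{e.course}: {displayed_grade(e)}")  # Contracts: A-  /  Torts: *
```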
What did we learn?
  • Don’t
    • Ask too many questions
    • Skip automated reminders
    • Go without incentives
  • Do
    • Incentives work!
    • Reminders help
    • Load test system
CTEs Online

Presented by:

Ken Woo

Director, Law School Computing

Northwestern University School of Law

When?
  • 1st Semester: Spring 2004
  • 2nd Semester: Fall 2004
  • 3rd Semester: Spring 2005

Only 1.5 years into online evaluations

When? (continued)
  • Paper system, Fall 2003: 80%
  • Paper system, Spring 2003: 77%
  • Paper system, Fall 2004: 70%
  • 1st Semester, Spring 2004: N/A
  • 2nd Semester, Fall 2004: 70%
  • 3rd Semester, Spring 2005: 67.8%
Why?
  • Wanted to push everything onto the Web.
    • Everyone had some sort of web access
    • Lose paper and go paperless
  • Centralized storage location
    • On a centralized server
    • No Data Steward available
    • Access by Registrar and Registrar Team only
    • Professors can view own results
Why? (continued)
  • Perceived as easier to manage
    • Changes were easier for Registrar
    • 3 types of forms
      • Standard (19 questions)
      • CLR (23 questions)
      • Clinic (18 questions)
    • Legibility was a small issue
Lessons Learned
  • Very similar to paper questions with some added questions for clarity
  • Participation rate is falling
  • Some ideas to increase participation
    • Withhold transcripts – no
    • Withhold final grades – no
    • Let students know they will see no results if they do not participate – starting next semester, Fall 06
Q & A

CTEs Online Presented by:

Ken Woo

Director, Law School Computing

Northwestern University School of Law

University of Denver Sturm COL Experience
  • Why Online Evaluations
    • Academic Dean was the instigator; wanted better, more timely access to evaluations, particularly comments.
    • Hoped to get more meaningful written comments, both good and bad.
    • Our school has a culture of students and search committees making use of written comments.
University of Denver Sturm COL Experience
  • Web Manager built a homegrown Cold Fusion application, using the current evaluation form and procedures as a starting point.
  • Data pulled from the administrative (Banner) system.
  • Course and student data stored in one database, results in a separate db (anonymity; see the sketch after this list).
  • Questions are generated dynamically.
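
The two-database split can be sketched as follows. SQLite stands in for the actual Cold Fusion data stores and the schema is an assumption; the point is that the completion record carries identity while the response rows carry none, so nothing links an answer back to a student.

```python
# Sketch of the anonymity split: one database records WHO has completed an
# evaluation, a separate database records WHAT was answered, and no key
# links a response row back to a student. SQLite stands in for the real
# stores; the schema is an assumption.
import sqlite3

roster_db = sqlite3.connect("roster.db")    # course and student data
results_db = sqlite3.connect("results.db")  # responses only

roster_db.execute(
    "CREATE TABLE IF NOT EXISTS completions (student_id TEXT, course_id TEXT)"
)
results_db.execute(
    "CREATE TABLE IF NOT EXISTS responses "
    "(course_id TEXT, question INTEGER, answer TEXT)"
)

def submit_evaluation(student_id, course_id, answers):
    """Record completion (identity) and answers (anonymous) separately."""
    # Identity side: lets reminders stop; says nothing about the answers.
    roster_db.execute("INSERT INTO completions VALUES (?, ?)",
                      (student_id, course_id))
    # Response side: no student identifier is stored at all.
    for number, answer in enumerate(answers, start=1):
        results_db.execute("INSERT INTO responses VALUES (?, ?, ?)",
                           (course_id, number, answer))
    roster_db.commit()
    results_db.commit()
```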
University of Denver Sturm COL Experience
  • Initial concerns taken into account.
    • Faculty – only registered students may evaluate, one evaluation per student; no evaluation after the exam.
    • Students – retain anonymity; no faculty access before grades.
  • Additional Student Concern
    • Students complained this format would be too time-consuming – not addressed; later feedback suggests students appreciate the freed-up class time.
University of Denver Sturm COL Experience
  • Additional Faculty Concerns – how addressed
    • Lower response rates – a pilot was conducted to get a feel for response rates before faculty approval of online evals.
    • Concern that comments would be too accessible, leaving “less popular” professors vulnerable – agreed that the Academic Dean could remove very negative comments from public view.
    • Not all courses followed the standard exam schedule – handled case by case.
University of Denver Sturm COL Experience
  • Assoc. Dean wanted data to take to faculty – came to Ed. Tech.
    • Started with a pilot group in Fall 02 – 7 professors and 10 courses participated.
    • Spring 03: all adjuncts and a handful of appointed faculty – 80 courses in all.
    • Summer 03: all courses participated.
University of Denver Sturm COL Experience
  • Evaluation Procedures
    • Evaluation goes online 2 weeks prior to semester end – available through the day before exams begin. Originally only the last two weeks of class – extended during the 1st pilot.
    • Students receive emails with links to all their course evaluations and detailed instructions.
    • Reminder emails sent every other day or so to those who have not completed.
University of Denver Sturm COL Experience
  • Results from the pilots were encouraging. Response rates were good (higher than paper), though inflated by incentives and babysitting.
  • Summer was low, but it had a very short evaluation period.
  • Dean took data to faculty for approval to move all courses online. Approval given beginning Fall 2003.
University of Denver Sturm COL Experience
  • Response rates – real-use setting
University of Denver Sturm COL Experience
  • Reasons for drop in response rates - speculation
    • Change of Academic Dean. Current dean is not invested; less hands-on encouragement.
    • Novelty wearing off. This year we had our first incoming class who never did a paper evaluation. No novelty factor – just another chore.
University of Denver Sturm COL Experience
  • What should we do?
    • Nothing? The assessment department is happy with 70%, and we are getting better rates than other divisions.
    • My preference – get the new dean back on board, even more reminders, more advertising.
    • Better communication to faculty about timing so they can tell students what to expect.
University of Denver Sturm COL Experience
  • Next steps
    • More sophisticated results generation. Advanced searching: ability to compare profs side by side, show all evals for a professor or a course.
    • Streamline course list interaction. Build direct access to Banner system rather than pulling data out of the admin system. Not likely to happen.
    • Move from Access back-end to SQL Server.
University of Denver Sturm COL Experience
  • Potholes to watch out for.
    • Difficult to know how good the data is. We realized late that the person pulling lists didn’t have permission to see non-law students enrolled in law classes. There is no way to spot that by eyeballing such large amounts of data: 150 courses and nearly 5,000 individual evaluations.
    • Different schedules for different courses can cause headaches. 1st-year Legal Writing wanted complete control over timing; some courses finish early. Hard to keep those in institutional memory. Anytime an individual eval has a different schedule, response rates are lower.
University of Denver Sturm COL Experience
  • Potholes (cont.)
    • Complete anonymity made it tedious to handle the few instances of students filling out one evaluation as though it were for a different professor. Mostly resolved by adding the professor’s name throughout the text of the eval, in as many places as possible.
    • Students want to retract an evaluation (usually a negative one). This semester was the first time we heard this request. The Academic Dean turned down all requests and shut the door on additional ones.
University of Denver Sturm COL Experience
  • And a sink hole…
    • A more pervasive problem: with any ed tech project, once we do something it becomes “ours.”
    • Problematic because we don’t have the staff to take on administrative functions, nor have we been given the power to handle issues with those functions.
University of Denver Sturm COL Experience
  • Remedies?
    • Proactive – never take too much control of a project. Build as much administrative functionality in as possible at the beginning.
    • If you’ve taken on too much, give it back: if it was their job before it was online, it should still be their job.
    • Easier said than done.
University of Denver Sturm COL Experience
  • Final words of wisdom
    • Don’t try to reinvent the wheel. We found we had better buy-in when we agreed to keep the system as close to the original as possible.
Contact information

Carrie Mahan Groce

Web Manager

University of Denver Sturm College of Law

cmgroce@law.du.edu

303.871.6098

The Duke Law Experience
  • Introduced Summer 2003 without much planning, when Scantron equipment failed and replacement was deemed too expensive
  • My motivation was to provide a service to the law school that would benefit all: more efficient for staff and students; unmediated access for faculty; better community access to public information (summaries)
The Duke Law Experience
  • Homegrown, PHP-based survey software was employed
  • Student Information System provided rosters
  • Local email system provided authentication (through LDAP) for both students and faculty (a minimal sketch follows)
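
A minimal sketch of such an LDAP check, using the third-party ldap3 library in Python rather than whatever Duke's PHP application actually did; the server name and DN pattern are assumptions.

```python
# Hypothetical LDAP authentication against the local directory.
# Server address and DN pattern are assumptions; requires `pip install ldap3`.
from ldap3 import Connection, Server

LDAP_SERVER = "ldap.law.example.edu"
USER_DN = "uid={username},ou=people,dc=law,dc=example,dc=edu"

def authenticate(username: str, password: str) -> bool:
    """Return True if the directory accepts a simple bind for this user."""
    if not password:
        return False  # an empty password would bind anonymously
    server = Server(LDAP_SERVER, use_ssl=True)
    conn = Connection(server,
                      user=USER_DN.format(username=username),
                      password=password)
    ok = conn.bind()  # True on a successful bind, False otherwise
    conn.unbind()
    return ok
```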
Shortsightedness….
  • Paper form was copied without re-evaluation
  • 10 minutes for in-class completion of paper evaluations was “given back” to faculty
  • Incentives for students were not thought through

Scale changes are very problematic

Things we designed right
  • Registrar has direct control over which classes are included; which faculty are associated with each class; etc.
Things we designed right
  • Students can submit “conditional evaluations” when they fail to log in correctly or are not in our roster
Things people want
  • Students want to be able to edit, save, and come back to evaluations later
  • Registrar and some faculty members want individualized time windows for certain classes (a minimal sketch follows)
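
A minimal sketch of such individualized windows, with a per-class override falling back to a semester-wide default; the course IDs and dates are made up.

```python
# Hypothetical per-class evaluation windows: a class can override the
# default semester-wide window. Course IDs and dates are made up.
from datetime import date
from typing import Optional

DEFAULT_WINDOW = (date(2005, 11, 28), date(2005, 12, 9))   # assumed default
CLASS_WINDOWS = {
    "LAW-101": (date(2005, 11, 14), date(2005, 11, 23)),   # finishes early
}

def evaluation_open(course_id: str, today: Optional[date] = None) -> bool:
    """True if this class's evaluation is currently accepting responses."""
    today = today or date.today()
    start, end = CLASS_WINDOWS.get(course_id, DEFAULT_WINDOW)
    return start <= today <= end
```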
Student Response Rate
  • 70% response rate required to share course eval summaries with community
  • Students need constant cajoling or we need to provide a better incentive
  • Some faculty are apprehensive about including students who would not have been in attendance on the day of paper evaluations, and are uneasy about cajoled students
Student Response Rate

Time scheduled for evals in large classes

Student Response Rate

Automated and person-specific email from Associate Dean

Student Response Rate

Second automatic email from Associate Dean

and cajoling email from Registrar

Incentives under “consideration”
  • Withhold registration for following semester
  • Withhold grades
  • Withhold free printing
  • Withhold firstborn….
Issues
  • Security – not discussed much, but was a big part of planning
  • Privacy – deal breaker for some students; responses are anonymized before release
  • Accuracy – faculty are suspicious of mix-ups; varying scales have confused students
  • Urban legends – stories abound among faculty about how Prof X saw everyone’s evaluations, etc.
Future
  • Evaluation form is being reworked: easier to fill out, less confusing
  • Incentives are being considered
  • Scantron on/off-line solutions are being weighed
  • Support could at any point be withdrawn – and probably would have been, were another solution easy to implement…
Contact information

Wayne Miller

Director of Educational Technologies

Duke University School of Law

wmiller@law.duke.edu

919-613-7243

http://edtech.law.duke.edu/