
Transitioning to IDEA Online – Paths to Follow and Pitfalls to Avoid


Presentation Transcript


  1. Transitioning to IDEA Online – Paths to Follow and Pitfalls to Avoid Tom Wangler (IDEA On-Campus Director) IDEA Users Group Meeting, May 30, 2010 – Chicago, IL

  2. Why Move to Online Surveys? • Student concerns related to instructors identifying their handwriting • Saves money by eliminating paper forms • Saves trees by eliminating paper • Instructors get their 20 minutes of class time back • No more lost or misplaced IDEA packets to track down • Reduces survey fatigue • Students like the convenience of completing the survey at home

  3. Concerns Going Into the Pilot • How will the overall response rate be affected? • Instructors must give up control of when the survey is given (students could complete the survey right before or right after a test). • Potential negative impact on instructors’ ratings, due to the possibility that only disgruntled students will take the time to complete the survey.

  4. How We Picked Classes for the Pilot Group • Classes of 20 or more students (unless the class was scheduled in a computer lab) • Must be a volunteer (not an “army” volunteer) • All are welcome, but must go through the department chair • No more than one or two classes per instructor • Any class that meets in a computer lab

  5. Maximizing the Response Rate • Followed the “best practices” published by the IDEA Center – they work! • Communication from instructor to students • Communication from the IDEA guy to instructors • FAQs for instructors and FAQs for students • Tracking each class’s response rate and notifying the instructor once a week • Assurance to students – surveys are confidential and not seen by faculty until grades are turned in

  6. Maximizing the Response Rate • Student senators (handed out flyers and FAQs during office hours, posted flyers around campus, offered ice cream and cookies outside the computer lab for those who completed the survey) • The student newspaper ran an article on IDEA online at the beginning of the surveying period • Incentives and inducements – “class” extra credit, not “individual” extra credit • Technology – email reminders, a WebCT link, and class distribution lists (but no tweets on Twitter)

  7. Response Rates • A 65% response rate is desirable to get a representative sample • Average response rate when paper surveys are used = 78% • Average response rate when online surveys are used = 55%

  8. Results of the FA09 Online Pilot • 59 classes participated – these classes were taught “on ground” and evaluated “online” • The average response rate for classes in the pilot group was 85% • The average response rate (at Ben U) for classes using paper surveys is 84% • Students’ response was overwhelmingly positive • Instructors’ response was surprisingly positive • Office assistants’ response was enthusiastically positive • Department chairs’ response was proactively positive – many moved to online surveys in SP10

  9. Student Feedback • I was more honest on online survey. • Online surveys take less time. • Online is better because you don’t have to worry about time. • Online is better because … students can truly tell how they felt about the course and the teacher. • Online is faster because I can type faster than I can write. • Online is more private – it’s perfect! • Online surveys don’t waste class time. • Online is better because it saves trees; more earth friendly. • Much easier to do online than in a class of piers [sic].

  10. Faculty Feedback • Faculty liked getting their 20 minutes back • They liked “going green” – it goes well with the “years of environment” theme at Ben U • They thought it was more efficient and less of a hassle than paper • About half the instructors offered extra credit of some sort; half did not offer any incentives or inducements

  11. SP10 Semester • The results of the FA09 pilot were reported to deans, chairs, and faculty • Many chairs moved their entire departments to online surveying in the spring • Some chairs moved part of their department to online surveying, e.g., only tenured faculty used online surveys • Several hundred courses used IDEA online surveys university-wide (N = 368) • The overall response rate was 71% for courses taught “on ground” and evaluated “online”

  12. Questions that Came Up (SP10) • Q1: Are the average response rates for upper-level vs. lower-level classes different when online surveying is used? (Answer is on the handout.) • Q2: Is there any difference in the ratings instructors receive when online surveying is used? (Answer is on the handout.) • Q3: Will the response rate be negatively impacted if an instructor chooses not to give students incentives to complete the survey outside of class? (Answer is on the handout.)

  13. Paths to Follow • Start with a small pilot group, if possible • Meet with instructors to explain the online process • Ensure that everyone who has a part to play knows their part • Timely communication is critical at all stages • Follow “best practices” for online surveying • Get student groups involved as much as possible, e.g., student senators • Create a culture in which completing the online survey is “expected” of students as responsible citizens of the academic community (this will take some time)

  14. Pitfalls to Avoid • Technology – the WebCT zinger • Trouble uploading the link or icon to WebCT • Emails from the IDEA Center were sent to students’ Junk E-mail folders (the IT department had to whitelist the IDEA Center) • Email reminders sent to students’ work or home email bounced back to the office assistant • Emails to faculty with overall response rates sometimes went to the Junk E-mail folder (had to change settings in Outlook) • Pay attention to when the surveying window opens and when it closes – timing is important • Pay attention to faculty questions and concerns – find good answers and give faculty time to adjust

  15. Future Plans • In FA10, the entire university will move to IDEA online surveys. Questions?
