
Survey Participation: A Study of Student Experiences and Response Tendencies




  1. Survey Participation: A Study of Student Experiences and Response Tendencies Allison M. Ohme, IR Analyst Heather Kelly Isaacs, Assistant Director Dale W. Trusheim, Associate Director Office of Institutional Research & Planning University of Delaware June 1, 2005 AIR 2005 ~ San Diego, CA

  2. Background • University of Delaware • Fall 2004 Enrollment • Undergraduate: 16,548 • Graduate: 3,395 • Professional & Continuing Studies: 1,295 • TOTAL: 21,238 • Doctoral/Research – Extensive

  3. Background (cont.) High-ability student body – ability has been increasing over the past five years.

  4. Our Past Surveys • IR typically surveys undergraduates each spring, alternating between the ACT Student Opinion Survey, NSSE, and a homegrown survey. • Example response rates: • Student Opinion: 1995: 30%; 1998: 26%; 2002: 21% • Career Plans: 1996: 46%; 1999: 43%; 2001: 37%
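The rates above are simple proportions. As a minimal sketch (the slide reports only percentages, so the counts below are hypothetical, not the study's actual numbers):

```python
def response_rate(returned, sampled):
    """Response rate: completed/returned surveys as a percentage of the sample."""
    return 100.0 * returned / sampled

# Hypothetical counts: 300 surveys returned from a sample of 1,000 students.
print(response_rate(300, 1000))  # 30.0, comparable to the 1995 Student Opinion rate
```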

  5. A Survey about Surveys??? • Declining response rates… • Incentives? • Paper v. web survey? • Timing of administration? • Develop a systematic study to examine these issues and their relation to poor response rates.

  6. Research Objectives • Use focus groups and telephone interviews to discover: • How many survey requests does a typical undergraduate receive? • What factors make students likely (or unlikely) to respond to a survey? • Then use this information to improve student response rates on future surveys.

  7. Methodology – Survey Questions (see Appendix A) • Thinking back to the previous full academic year (2002-2003), how many surveys from any sources were you asked to complete at the University? What was the source of the survey(s)? • How many surveys did you complete and return? • What were the reasons that helped you decide to complete and return the survey(s)?

  8. Methodology – Survey Questions (cont.) • What were the reasons that made you decide not to complete and return a survey? • How do you feel when you receive an unsolicited survey? What kind of impact do they have on you? • What suggestions do you have for increasing student response rates at UD?

  9. Methodology – Initial Research Design • Random sample of: • Full-time undergraduate students • Continuing from previous academic year (2002-2003) • Contact students via telephone and ask the screening question: Have you received at least one unsolicited survey from the University in the past academic year? • If “yes”, student was invited to participate in one of five focus groups (filling ten students/group).
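The design above – a simple random draw followed by a screening question – can be sketched as follows. The roster, sample size, and 74% "yes" rate are illustrative assumptions (74% mirrors the later finding that 26% of respondents received no surveys), not data from the study:

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical roster: student IDs mapped to their answer to the screening
# question ("Have you received at least one unsolicited survey from the
# University in the past academic year?"), with ~74% answering "yes".
roster = {f"student{i}": random.random() < 0.74 for i in range(1000)}

# Draw a simple random sample of continuing full-time undergraduates.
sample = random.sample(sorted(roster), k=100)

# Screening step: only "yes" respondents are invited to a focus group.
eligible = [s for s in sample if roster[s]]
```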

  10. Methodology – Initial Research Design (cont.) • If unable to attend a focus group, the student was given the opportunity to answer the same research questions as part of our telephone survey group. • Once 50 students answered the telephone survey, this portion of the methodology was closed. • Incentive: two drawings for $100 gift certificates to use in downtown Newark.

  11. Methodology – Adjusting the Research Design After only slight success in filling the focus groups: • Opened the study to students answering “no” to the screening question. • Drew additional sample of students who had been sent an Economic Impact Survey in Fall 2003.

  12. Methodology – Need for an Additional Method • Low focus group attendance (even after confirmations with the participants) yielded 8 students over three groups. • Added third method: in-person interviews of students in the UD Student Center’s Food Court. • Students answered the same questions, and were given a $5 coupon redeemable in campus Food Courts.

  13. Total Sample • Total sample over 3 methods (n=108) • Focus Group Sample (n=8) • Telephone Interview Sample (n=50) • In-Person Interview Sample (n=50) See complete demographic breakdown in Appendix B.

  14. Findings • In academic year 2002-03: • 26% of respondents did not receive any unsolicited surveys. • 48% received 2 or more surveys. • Survey sources: academic departments, Honors Program, Dining Services, graduate students, etc.

  15. Findings – (cont.) • How many surveys did students complete and return? • 66% of the 80 students who received surveys completed/returned all surveys. • 24% completed/returned some of the surveys. • 10% did not complete/return any of the surveys. ~ Remember these are the reported response rates of students who volunteered to participate in this study. It is no surprise that they are higher than typical survey response rates.

  16. Findings – (cont.) • Reasons for completing and returning surveys: • Desire to help UD. • Survey related to students’ interest(s), or results could affect their personal experience. • Students completed both email and paper surveys when they had “free time” and the survey required minimal effort. • When approached in-person, students find it difficult to refuse, especially when receiving an instant incentive.

  17. Findings – (cont.) Desirable incentives: • T-shirt • Free meal • Schoolbooks & supplies • Candy • Money • Coupon redeemable for any of the above • Any incentive students can accept immediately

  18. Findings – (cont.) • Reasons for not completing and returning surveys: • Survey not of interest to the student. • Annoyed by receiving so many and/or multiple survey requests. • Survey seemed too complicated or required too much time/effort to complete. • Impact on Students? • Most students understand surveys are a normal procedure of any university or organization. • However, students are frustrated after not seeing any changes or receiving any follow-up after completing past surveys.

  19. Findings – (cont.) • Suggestions for increasing response rates: • Use the incentives mentioned above. • Tailor survey descriptions with explicit impact statements. • Offer follow-up to announce results and impact. • Keep surveys short and easy to understand and complete. • Best time to survey = mid-semester. ~ Survey method preference (email, paper, in-person) varies by student.

  20. Challenges in Practice • Survey administration is decentralized across campus. • Using multiple methods (paper/web-based) for one study requires additional coordination. • Students already feel “over-surveyed”. • Large volume of spam in students’ UD inboxes.

  21. Improving Response Rates • Entering Student Needs Assessment • 2001 = 21% • 2003 = 15% • 2004 ACT Survey = 69% A 69% response rate – how did we do it?

  22. Another Example… • Career Plans Survey • 2002= 48% • Random sample of 25% of baccalaureate recipients • 2003= 41% • Random sample of 50% of baccalaureate recipients • 2004= 50% • Sampled entire class of baccalaureate recipients
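The trade-off on this slide is between sampling fraction and yield: a larger sample at a similar response rate produces far more completed surveys. A rough sketch of that arithmetic (the class size is a hypothetical figure, since the slide gives only fractions and rates):

```python
CLASS_SIZE = 3000  # hypothetical number of baccalaureate recipients

def completed_surveys(sample_fraction, response_rate):
    """Expected completed surveys for a given sampling fraction and response rate."""
    return round(CLASS_SIZE * sample_fraction * response_rate)

# 2002: 25% sample at a 48% response rate vs. 2004: full class at 50%.
print(completed_surveys(0.25, 0.48))  # 360
print(completed_surveys(1.00, 0.50))  # 1500
```

Even though the 2002 response rate was nearly as high, surveying the entire class in 2004 yields roughly four times as many completed surveys under these assumptions.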

  23. Questions or Comments?

  24. Thank you! Allison M. Ohme aohme@udel.edu Heather Kelly Isaacs hkelly@udel.edu Dale W. Trusheim trusheim@udel.edu www.udel.edu/IR
