The Effect of Incentives on Internet Surveys



  1. The Effect of Incentives on Internet Surveys John M. Kennedy Judith A. Ouimet Indiana University Bloomington

  2. Why Provide Incentives • To improve survey data quality • Higher response rates • Fewer breakoffs • Fewer missing items • More precise responses • To improve the respondent experience • Respect for their time • Reward for cooperation

  3. Improving Data Quality • Most incentive research focuses on improvements in response rates • Improved response rates do not necessarily indicate better-quality survey data • Whether data quality has improved is difficult to determine • External data to evaluate responses are rarely available • Differences may be due to sampling

  4. Incentives May Not Be the Solution • Incentives may not be cost-effective • Resources used for incentives might instead fund other procedures that improve survey data quality • Incentives may reduce survey data quality • May encourage respondents to complete the survey just to receive the incentive • Reduced attention to survey questions • May trigger spam filters

  5. Improving the Respondent Experience • Declining response rates may be due to badly designed surveys that show indifference to the respondent experience • Improving the respondent experience requires demonstrating respect for respondents’ time and effort • Not all incentives show respect for all participants • The research on incentives does not indicate which incentives work best under which conditions

  6. Internet Survey Incentives • More difficult to manage than in postal, in-person, or telephone surveys • Less contact information is often available for Internet users • Distribution may require postal mailings • Additional costs and modes may draw resources from the survey administration

  7. Types of Incentives: Internet Only • Non-contingent • Advance cash or gift • All in sample entered into a lottery • Resources provided even to those who do not participate • Generally requires a postal mailing or additional effort by the respondent to pick up the incentive

  8. Contingent Incentives • Usually easier to administer • Contact information can be gathered • Types • Lotteries, cash, gift cards • Provided to those who participate in the survey • Reduced costs

  9. Internet Survey Incentives Research • Research conducted in the US and Europe • Mixed results • Not all incentives improve survey data quality • Evaluation standards differ across studies • Some evaluate response rates, item non-response, or breakoffs • Many examine multiple indicators of survey data quality

  10. Internet Survey Incentive Research • 30 articles and chapters • Four meta-analyses or literature reviews • Internet surveys were the sole or main focus • Either experiments or comparisons with similar surveys • 15 conducted in the US and Canada; 11 in Europe • Populations include students, physicians, scientists, panels, etc.

  11. More Incentives Research • Non-contingent and contingent • Types • Cash • Gift cards • Lotteries • Focus • Response rates • Item non-response • Breakoffs

  12. Response Rates Summary • Overall, incentives appear to improve response rates • Lotteries often, but not always, improve response rates • Non-contingent incentives almost always improve response rates • Improvements are usually small

  13. Other Outcomes • Incentives generally reduce item non-response and breakoffs • Cash is more effective than lotteries • Incentives appear to improve response rates for the first wave of a panel only • Incentives can have differential effects on different groups • Incentives can affect survey responses

  14. Summary • Results are mixed • Differences by group, contingency, stage of the survey process, etc. • Cost-effectiveness is not easily determined • Incentives are chosen by the researchers, not the participants • May not respect the participants’ preferences • The wrong incentives may reduce effectiveness

  15. Student Survey Example • National Survey of Student Engagement (NSSE) • Survey of undergraduates in the US and Canada • Started in 2000 • Response rates generally lower than desired • Incentives offered differentially • Individual universities offer incentives • Different incentives over time

  16. NSSE and IUB • Response rates declining over time • Higher response rates desired • Incentives introduced in 2010 • Follow-up survey to determine reasons for participation or non-participation • Asked about incentives that would make response more likely

  17. NSSE 2001–2009 Response Rates and Trend Line (chart not reproduced in the transcript)

  18. Incentives Offered in 2010 • Every responder received a free soda at the IMU • Prizes: • 5 iPods • 2 lunches for 4 at the Tudor Room • 50 semester lockers with towel service • 25 CycleFit 3-session punch cards

  19. Incentive Survey Process • Surveyed students after the NSSE survey ended • Created three questionnaires with response options tailored to whether or not students had participated in the survey • Respondents • Non-respondents • First-year students • Seniors • Most questions focused on reasons for participation or non-participation • Asked students what kinds of incentives would encourage them to participate in NSSE • One plausible routing of students to questionnaire versions is sketched below
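
The transcript does not say exactly how the three questionnaire versions were split across these groups, so the routing below is a hypothetical reading (one respondent version, plus separate non-respondent versions for first-year students and seniors); the function and label names are illustrative, not from the presentation. A minimal Python sketch:

    def questionnaire_version(responded: bool, class_year: str) -> str:
        # Hypothetical split: respondents share one version; non-respondents
        # get a version tailored to their class year.
        if responded:
            return "respondent"
        if class_year == "first-year":
            return "non-respondent, first-year"
        return "non-respondent, senior"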

  20. What Contributed to Your Willingness to Respond? (chart not reproduced in the transcript)

  21. Responder Student-Suggested Incentives (chart not reproduced in the transcript)

  22. Non-Responder Student-Suggested Incentives (chart not reproduced in the transcript)

  23. Closing the Loop – Incentives in 2011 • Increased the number of incentives • Increased the odds of winning • Drawing only

  24. Incentives Offered in 2011 • Apple iPad 32 GB (value $599; 1 winner) • Apple iPod Touch (value $199; 3 winners) • Crimson hooded sweatshirt with white IU Trident (value $40; 48 winners) • Lunch for two at the Tudor Room (value $26; 8 winners) • Crimson T-shirt with white IU Trident (value $18; 145 winners) • Mother Bear’s Pizza large one-topping pizza (value $14; 250 winners) • Crimson foldable 42-inch umbrella with white IU Trident (value $13; 50 winners) • CycleFit 3-session punch card (value $12; 5 winners) • Circuit Training session punch card (value $12; 5 winners) • TIS gift card (value $10; 250 winners) • Scholar’s Inn Bakehouse gift card (value $10; 125 winners) • SRSC T-shirt (value $10; 10 winners) • Gift card for Target, Amazon, or Starbucks (value $10; 100 winners) • Bloomington Bagel Company 5 Wooden Nickels (value $5 total; 257 winners)
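
For scale, the list above works out to 1,257 prizes and $16,339 in total prize value. A minimal Python sketch of that arithmetic, using only the values and winner counts from slide 24 (the variable names are illustrative):

    # (unit value in USD, number of winners) for each 2011 prize,
    # in the order listed on slide 24.
    prizes = [
        (599, 1), (199, 3), (40, 48), (26, 8), (18, 145),
        (14, 250), (13, 50), (12, 5), (12, 5), (10, 250),
        (10, 125), (10, 10), (10, 100), (5, 257),
    ]
    total_winners = sum(n for _, n in prizes)            # 1257 prizes awarded
    total_value = sum(value * n for value, n in prizes)  # $16,339
    print(total_winners, total_value)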

  25. Follow-Up Survey 2011 • Conducted after the NSSE IUB field period ended • Similar to the 2010 follow-up survey • Reasons for participation or non-participation • Desired incentives

  26. Which Affected Your Decision to Complete the Survey? (chart not reproduced in the transcript)

  27. Incentives in 2012 • Two-stage incentives • All completers • Scoop of ice cream (coupon sent electronically) • Free bagel coupon (picked up at the library) • All completers • Entered into a weekly drawing to win one of 115 prizes • The earlier students completed the survey, the more drawings they were entered into (see the sketch below)
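
The last bullet implies a rolling design in which a completer is entered into the drawing for their completion week and every later week. The presentation does not give the number of weekly drawings, so the week count and function name below are assumptions; a minimal Python sketch of the entry rule:

    def drawing_entries(completion_week: int, total_weeks: int) -> int:
        # A student who completes in week k is entered into the weekly
        # drawings for weeks k through total_weeks, so earlier completion
        # yields more chances at the 115 prizes.
        if completion_week < 1 or completion_week > total_weeks:
            return 0  # completed outside the field period
        return total_weeks - completion_week + 1

    # With a hypothetical 8-week field period, a week-1 completer gets
    # 8 entries while a week-8 completer gets only 1.
    print(drawing_entries(1, 8), drawing_entries(8, 8))  # -> 8 1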

  28. Drawing Incentives in 2012 (details not reproduced in the transcript)

  29. 2012 Response Rate • Response rate decreased to 30% • Changes in 2012 • Bus with NSSE facts • List of past winners on the website (viewed 1,489 times) • List of prizes embedded in invitation and follow-up emails (viewed 1,495 times) • Speculation on why the rate fell • Later start date • Competing surveys

  30. Lessons Learned • Sample members may not accurately predict the utility of incentives • Managing a complex set of incentives requires resources • Respondents may attribute more of their decision to participate to the incentives when asked later

  31. Conclusions • Incentives can improve response rates for Internet surveys • The effect of incentives on data quality and differences across groups needs much more research • More research is needed on why and how incentives affect the decision to participate

  32. Contact Information John Kennedy kennedyj@indiana.edu Judy Ouimet ouimet@indiana.edu
