
Effects of Design in Web Surveys


Presentation Transcript


  1. Effects of Design in Web Surveys • Vera Toepoel • Tilburg University, The Netherlands

  2. CentERdata: Two Online Panels • 1. CentERpanel • Has existed for 17 years • 2000 households • Respondents fill out questionnaires every week • Online interviews as method, but: • Probability sample drawn from the address sampling frame of Statistics Netherlands • Recruitment of new panel members is address-based • Includes households without internet access (less than 20%): equipment provided

  3. CentERdata: Two Online Panels • 2. LISS Panel • Grant from The Netherlands Organisation for Scientific Research • 5000 households • Established in 2007 (we fielded the first questionnaire!) • Respondents fill out questionnaires every month • Online interviews as method, but: • Probability sample drawn from the address sampling frame of Statistics Netherlands • Contacted by letter, telephone, or visit • Includes households without internet access (less than 20%): equipment provided

  4. 1 item per screen

  5. 4 items per screen

  6. 10 items per screen

  7. Answer categories

  8. Open-ended

  9. Vertical: positive to negative

  10. Horizontal

  11. Numbers 1 to 5

  12. Numbers 5 to 1

  13. Numbers 2 to -2

  14. Trained Respondents: Panel conditioning • Content (knowledge on topics) • Prepare for future surveys • Develop attitudes • Procedure (question-answering process) • Learn how to interpret questions • Answer strategically • Speed through the survey

  15. Procedure (answer process) • Differences between trained and fresh respondents with regard to web survey design choices • Items per screen • Response category effects • Question layout

  16. Overall: • Difference in mean duration of the entire survey between panels: 436 seconds for the trained panel and 576 seconds for the fresh panel.
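
Note: As a purely illustrative sketch (not the authors' analysis), the Python snippet below shows how such a difference in mean completion time could be tested with Welch's t-test. The per-respondent durations, sample sizes, and spreads are simulated placeholders chosen only to roughly match the reported means.

    # Illustrative only: Welch's t-test on simulated completion times (seconds).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    trained = rng.normal(loc=436, scale=120, size=500)  # hypothetical trained-panel durations
    fresh = rng.normal(loc=576, scale=150, size=500)    # hypothetical fresh-panel durations

    t_stat, p_value = stats.ttest_ind(trained, fresh, equal_var=False)
    print(f"trained mean = {trained.mean():.0f}s, fresh mean = {fresh.mean():.0f}s")
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.4g}")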

  17. Experiment 1: Items per screen • Social Desirability Scale • 10 items • 3 different formats: • 1 item per screen • 5 items per screen • 10 items per screen

  18. Experiment 1: Items per screen • Trained respondents had higher inter-item correlations for multiple-item-per-screen formats. • No significant difference in item non-response. • Mean score of the Social Desirability Scale showed no evidence for social desirability bias. • The mean duration to complete the ten social desirability items did not differ significantly between panels.
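
Note: "Inter-item correlations" here refers to the pairwise correlations among the ten scale items. Below is a minimal sketch of how the average inter-item correlation could be computed, assuming hypothetical item data and column names; in the actual analysis this would be done separately per screen format and panel.

    # Illustrative sketch: average inter-item correlation of a ten-item scale.
    # The responses below are simulated stand-ins for the Social Desirability items.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    items = pd.DataFrame(rng.integers(1, 6, size=(300, 10)),
                         columns=[f"sd_item_{i}" for i in range(1, 11)])

    corr = items.corr()                               # 10 x 10 correlation matrix
    off_diag = corr.values[np.triu_indices(10, k=1)]  # unique item pairs only
    print(f"average inter-item correlation: {off_diag.mean():.3f}")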

  19. Experiment 2: Answer Categories

  20. Experiment 2: Answer Categories

  21. Experiment 2: Answer Categories • Category effect found • No difference in category effect between trained and fresh respondents

  22. Experiment 3: Question Layout • Question: Overall, how would you rate the quality of education in the Netherlands? • Answer: 5-point scale • Six formats: • Reference format (decremental) • Reverse scale: incremental • Horizontal layout • Add numbers 1 to 5 to verbal labels • Add numbers 5 to 1 to verbal labels • Add numbers 2 to -2 to verbal labels

  23. Experiment 3: Question Layout • Decremental vs. incremental: T+F • Vertical vs. horizontal layout: - • No numbers vs. numbers 1 to 5: - • Numbers 1 to 5 vs. numbers 5 to 1: T+F • Numbers 5 to 1 vs. numbers 2 to -2: T+F • Trained respondents more easily selected one of the first options. • T = significant difference in the trained panel • F = significant difference in the fresh panel
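
Note: One straightforward way to obtain significance flags like T and F above is a chi-square test comparing the distribution over the five answer categories between two layout conditions, run separately within each panel. The sketch below is illustrative only; the counts are invented and the original analysis may have used a different test.

    # Illustrative only: chi-square test of answer distributions across two layouts.
    # Rows = layout condition, columns = answer categories 1..5 (hypothetical counts).
    from scipy.stats import chi2_contingency

    counts = [[40, 85, 120, 60, 20],   # e.g. decremental layout
              [25, 60, 130, 80, 30]]   # e.g. incremental layout

    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")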

  24. Design Effects in Web Surveys: Comparing Trained and Fresh Respondents • Overall, few differences between trained and fresh respondents • Trained respondents are somewhat more sensitive to satisficing: • Shorter completion times • Higher inter-item correlations for multiple-items-per-screen formats • Select the first response options more often

  25. Current and Future Research • It has been little more than a decade since systematic research on visual design effects in web surveys began. • In the last decade, dozens of studies have been conducted. • It is now important that we begin to understand the relative importance of each of these visual effects. • Can we reduce visual effects through effective question writing?

  26. Effective Question Writing • Tourangeau, Couper, and Conrad (POQ 2007) suggest there may be a hierarchy of features that respondents attend to: • verbal language > numbers > visual cues • Question: Can the effects of visual layout be diminished through greater use of verbal language and numbers?

  27. Experiment 1: Visual Heuristics (joint with Don Dillman) • Tourangeau, Couper, and Conrad (POQ 2004; 2007): • Middle means typical: respondents will see the middle option as the most typical • Left and top means first: the leftmost or top option will be seen as the ‘first’ in a conceptual sense • Near means related: options that are physically near each other are expected to be related conceptually • Up means good: the top option will be seen as the most desirable • Like means close: visually similar options will be seen as closer conceptually • Experimental conditions: • Polar-point or fully labeled scale • With or without numbers (1 to 5)

  28. Middle Means Typical Fully labeled: even spacing Fully labeled: uneven spacing

  29. Left and Top Means First Fully labeled with color: consistent ordering Fully labeled with color: inconsistent ordering

  30. Near Means Related Polar point with numbers: separate screens Polar point with numbers: single screen

  31. Up Means Good Polar point with numbers: incremental Polar point with numbers: decremental

  32. Like Means Close Polar point Polar point with color

  33. Like Means Close Polar point with numbers (1 to 5) Polar point with different numbers (-2 to 2)

  34. Labels, numbers and visual heuristics: is there a hierarchy?

  35. Experiment 2: Pictures in web surveys (joint with Mick Couper) • Replication of the study by Couper, Tourangeau, and Kenyon (POQ 2004) • 1. No picture • 2. Low-frequency picture • 3. High-frequency picture • Add verbal instructions: • A. No verbal instruction • B. Instruction to include both high- and low-frequency instances • C. Instruction to include only low-frequency instances

  36. Low and High frequency picture

  37. Can verbal instructions reduce the effects of pictures? • MANOVA • Main effect of instructions: lambda = .597, p < .0001 • Main effect of pictures: lambda = .964, p < .0001 • Interaction instructions*pictures: lambda = .9691, p < .0001 • Both main effects and the interaction are significant, but since a smaller Wilks' lambda indicates a stronger effect, instructions explain more of the variation in the answers than pictures!
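
Note: For readers unfamiliar with how such a MANOVA is set up, the sketch below shows one way to obtain Wilks' lambda for an instructions-by-pictures design using statsmodels. The data frame, factor levels, and outcome columns are all hypothetical; this is not the original analysis code.

    # Illustrative sketch: MANOVA with Wilks' lambda for an instructions x pictures design.
    # All data are simulated placeholders.
    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(2)
    n = 600
    df = pd.DataFrame({
        "instructions": rng.choice(["none", "high_and_low", "low_only"], size=n),
        "pictures": rng.choice(["none", "low_freq", "high_freq"], size=n),
    })
    y = rng.normal(size=(n, 3))                        # hypothetical frequency reports
    df["y1"], df["y2"], df["y3"] = y[:, 0], y[:, 1], y[:, 2]

    model = MANOVA.from_formula("y1 + y2 + y3 ~ instructions * pictures", data=df)
    print(model.mv_test())                             # table includes Wilks' lambda per effect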

  38. Future Research • How to reduce visual design effects in web surveys

  39. LISS data • Every researcher (irrespective of nationality) who wants to collect data for research with scientific, policy, or societal relevance can do so via the LISS panel at no cost • Proposals can be submitted through www.lissdata.nl • Existing data are freely available for academic use: • longitudinal core studies • proposed studies • disseminated through www.lissdata.nl
