
Eye-Tracking Technology for Questionnaire Evaluation

This study evaluates eye-tracking technology as a tool for questionnaire evaluation, assessing the visibility and effectiveness of key design elements such as routing instructions, reminder bubbles, and alpha-numeric boxes.


Presentation Transcript


  1. What the eye doesn’t see: Evaluating a paper-based questionnaire using eye-tracking technology Lyn Potaka, Statistics NZ

  2. Introduction • Eye-tracking technology is a potential tool for questionnaire evaluation • Primarily used for web development • Potentially useful for paper questionnaire development (Redline & Lankford, 2001) • Feasibility study in the NZ context

  3. Eye-tracking study • Small-scale study due to limited funding • NZ Census (2006) project • In collaboration with Access Testing Centre (Australia)

  4. How the technology works • Infra-red light reflecting off the eye illuminates areas of the retina important to vision • A camera captures eye movements • Software can then map the points at which the eye is resting on the questionnaire
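
To make the mapping step concrete, here is a minimal Python sketch of how exported fixation points might be matched against areas of interest (AOIs) drawn around each question to accumulate dwell time. The data format and all names are illustrative assumptions, not the study’s actual software:

```python
# Sketch of the gaze-mapping step: fixation points recorded by the eye
# tracker are matched against areas of interest (AOIs) drawn around each
# question, and dwell time is accumulated per question. The names and the
# export format are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class AOI:
    """Bounding box for one question on the scanned form, in pixels."""
    question_id: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dwell_times(fixations, aois):
    """Sum fixation durations (ms) per question.

    `fixations` is a list of (x, y, duration_ms) tuples exported from
    the eye tracker; `aois` is a list of AOI boxes.
    """
    totals = {aoi.question_id: 0.0 for aoi in aois}
    for x, y, duration_ms in fixations:
        for aoi in aois:
            if aoi.contains(x, y):
                totals[aoi.question_id] += duration_ms
                break  # assign each fixation to at most one AOI
    return totals

# Example: two questions, three fixations.
aois = [AOI("Q1", 0, 0, 400, 100), AOI("Q2", 0, 100, 400, 200)]
fixations = [(50, 40, 220), (120, 60, 180), (90, 150, 300)]
print(dwell_times(fixations, aois))  # {'Q1': 400.0, 'Q2': 300.0}
```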

  5. Key objectives • Primary objective: • To assess eye-tracking as a tool for questionnaire evaluation • Secondary objectives: • Evaluate the visibility of key elements on the form • In particular: routing instructions, reminder bubbles, and alpha-numeric boxes

  6. Routing instructions • Bracketed response options with a single routing instruction • Shorter line lengths • Concerns regarding errors of commission

  7. Reminder bubbles • Bubbles to remind respondents to mark correctly, or to look for more information • Bubbles appear outside of the main navigational path

  8. Alpha-numeric boxes • Concerns that boxes would prevent respondents from seeing options appearing underneath • Two versions tested (right-aligned boxes & indented boxes)

  9. Method • 16 respondents interviewed: • New Zealand residents • Split of male and female • Aged 18–55 years • Half-hour interviews • 4-page Census questionnaire (47 questions)

  10. Findings: General observations • Respondents typically observed information presented in the banner but didn’t dwell there • Respondents spent less time looking at questions in the lower-right regions of the form • Respondents didn’t always read all of the information presented before answering questions

  11. Findings: Routing instructions • No errors of omission observed • Some errors of commission recorded • Some respondents making errors of commission had observed the routing instruction but did not skip ahead • Suggests respondents who do not act on routing instructions immediately will often fail to recall them • Indicated that individual routing instructions at the end of each response option would be a better design
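
As an aside on the two error types: errors of commission mean answering questions a routing instruction said to skip, while errors of omission mean skipping questions that should have been answered. The sketch below shows one way such errors could be flagged automatically from captured responses; the response format and rule encoding are hypothetical and not part of the study:

```python
# Hypothetical routing rule: "if Q2 is answered 'No', skip ahead to Q5",
# encoded as ("Q2", "No") -> ["Q3", "Q4"] (the questions to leave blank).

def classify_routing_errors(responses, skip_rules):
    """Return (commission, omission) lists of question ids.

    `responses` maps question id -> answer (None if left blank).
    `skip_rules` maps (question id, answer) -> questions to skip.
    """
    skipped = set()
    for (qid, answer), targets in skip_rules.items():
        if responses.get(qid) == answer:
            skipped.update(targets)

    commission, omission = [], []
    for qid, answer in responses.items():
        if qid in skipped and answer is not None:
            commission.append(qid)  # answered a question they should have skipped
        elif qid not in skipped and answer is None:
            omission.append(qid)    # skipped a question they should have answered
    return commission, omission

rules = {("Q2", "No"): ["Q3", "Q4"]}
resp = {"Q1": "Yes", "Q2": "No", "Q3": "Smith", "Q4": None, "Q5": "NZ"}
print(classify_routing_errors(resp, rules))  # (['Q3'], [])
```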

  12. Findings: Reminder bubbles • Bubbles were often missed • Some bubbles were more likely to be missed than others • Characteristics of questions may have had an impact (e.g. position on page / complexity of question) • Indicated that bubbles should be used only for non-essential information

  13. Findings: Alpha-numeric boxes • Respondents sometimes failed to observe options which appeared below the alpha-numeric boxes • This occurred for both versions of the questionnaire • Respondents were less likely to miss options if they were actively seeking out an answer • Indicated that alpha-numeric boxes would pose a greater risk for particular question types

  14. Example of a respondent missing an option

  15. What did we learn? • Study confirmed the importance and impact of visual design on data quality • Supported existing knowledge and research on visual design • Small sample size limited the conclusions • Not appropriate to compare formats • Further work required to identify the question characteristics most likely to influence results

  16. Disadvantages • Required a considerable amount of time (large amount of data to integrate and analyse) • Dependent on the expertise and knowledge of technology specialists • Cost (?) • Technology had limitations (e.g. data loss when respondents turned the page or leaned in too close)

  17. Advantages • Dwell times and navigational patterns helped to identify difficult questions • Provided an objective measure that was convincing for clients • Gave indications of why mistakes were occurring (e.g. routing errors) • Helped us to identify improvements (e.g. position of routing instructions)
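
As a rough illustration of the first advantage, the sketch below builds on the earlier dwell-time example and flags questions whose dwell time is unusually high as candidates for redesign. The one-standard-deviation threshold is an arbitrary assumption, not a rule from the study:

```python
# Flag questions with dwell times well above the rest of the form as
# potentially difficult. Input is the per-question totals produced by the
# dwell_times() sketch above; the threshold choice is illustrative only.

from statistics import mean, stdev

def flag_difficult(dwell_ms):
    """Return question ids whose dwell time exceeds mean + 1 SD."""
    values = list(dwell_ms.values())
    threshold = mean(values) + stdev(values)
    return [q for q, t in dwell_ms.items() if t > threshold]

dwell = {"Q1": 400.0, "Q2": 300.0, "Q3": 2400.0, "Q4": 500.0}
print(flag_difficult(dwell))  # ['Q3']
```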

  18. What did we conclude? • A useful tool for the design of paper questionnaires • Individual projects (which questions are being read, which instructions are being missed, etc.) • Potential to expand questionnaire design knowledge generally (e.g. characteristics of visual design that work best) • Provides additional information to complement other evaluation strategies

  19. What would we do differently? • Consider the analysis carefully before beginning, to maximise learning • Consider the sample carefully (number and key characteristics required) • Allow more time

  20. Planned research • Analysis of ONS Census forms • Using more advanced technology • Building on the Stats NZ project to look at specific question characteristics that may impact on results
