
Presentation Transcript


  1. Web Design Issues in a Business Establishment Panel Survey. Third International Conference on Establishment Surveys (ICES-III), June 18-21, 2007, Montréal, Québec, Canada. Kerry Levin & Jennifer O’Brien, Westat

  2. Overview of Presentation • A brief review of the web design system and its origins • Design issues we encountered • Opportunities for experimental investigation

  3. Background • The Advanced Technology Program (ATP) at the National Institute of Standards and Technology (NIST) is a partnership between government and private industry to conduct high-risk research • Since 1990, ATP’s Economic Assessment Office (EAO) has performed rigorous and multifaceted evaluations to assess the impact of the program and estimate the returns to the taxpayer. One key feature of ATP’s evaluation program is the Business Reporting System (BRS).

  4. General Description of the BRS • Unique series of online reports that gather regular data on indicators of business progress and future economic impact of ATP projects • ATP awardees must complete four BRS reports per calendar year: three short quarterly reports and one long annual report

  5. General Description of the BRS • There are several different types of instruments (each with a profit and nonprofit version): • Baseline • Annual • Closeout • Quarterly • The BRS instruments are hybrid survey/progress reports that ask respondents attitudinal questions as well as items designed to gather information on project progress. • The Baseline, Annual, and Closeout reports are between 70 and 100 pages in length. Due to this length and complexity, web administration is the most logical data collection mode.

  6. Design issues: Online logic checks vs. back-end logic checks 1. Examples of online logic checks (i.e., hard edits) • Sum checking • Range checks 2. Examples of back-end logic checks • Frequency reviews • Evaluation of outliers

  7. Online sum checking: Example
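
  The slide itself showed a screenshot of the online instrument. As an illustrative sketch only (the field names, values, and messages below are hypothetical, not the BRS implementation), a hard-edit sum check might verify that component entries add to a reported total before the page can be submitted:

  ```typescript
  // Minimal sketch of a hard-edit sum check (hypothetical fields and messages).
  interface SumCheckResult {
    valid: boolean;
    message?: string;
  }

  function checkSum(components: number[], reportedTotal: number, tolerance = 0): SumCheckResult {
    const computed = components.reduce((sum, value) => sum + value, 0);
    if (Math.abs(computed - reportedTotal) > tolerance) {
      return {
        valid: false,
        message: `Entries sum to ${computed}, but the total reported is ${reportedTotal}. Please correct before continuing.`,
      };
    }
    return { valid: true };
  }

  // Example: quarterly expenditures must equal the reported annual total.
  console.log(checkSum([25000, 25000, 30000, 20000], 100000)); // { valid: true }
  console.log(checkSum([25000, 25000, 30000, 20000], 95000));  // fails with an error message
  ```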

  8. Online range checking: Example
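
  Again, the slide showed a screenshot. A minimal sketch of a hard-edit range check, with hypothetical bounds and item labels, might look like this:

  ```typescript
  // Minimal sketch of a hard-edit range check (hypothetical bounds and labels).
  // Values outside the allowed range block submission with an error message.
  function checkRange(value: number, min: number, max: number, label: string): string | null {
    if (Number.isNaN(value)) {
      return `${label} must be a number.`;
    }
    if (value < min || value > max) {
      return `${label} must be between ${min} and ${max}.`;
    }
    return null; // null means the entry passes the edit
  }

  // Example: number of project employees, assumed here to be limited to 0-10,000.
  console.log(checkRange(250, 0, 10000, "Number of project employees")); // null (passes)
  console.log(checkRange(-3, 0, 10000, "Number of project employees"));  // error message
  ```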

  9. Back-end checking: Frequency reviews and outlier evaluations • At the close of each cycle of data collection, the data for each instrument are carefully reviewed for anomalies • Frequency reviews are conducted to ensure that there were no errors in the skips in the online instrument • Although the BRS includes range checks for certain variables, the ranges are sometimes quite large; therefore, an evaluation of outliers is a regular part of our data review procedures
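
  As a rough illustration of this kind of outlier evaluation (not the authors' actual review procedure), a back-end script might flag values that fall far from the mean for analyst follow-up:

  ```typescript
  // Minimal sketch of a back-end outlier review (hypothetical data and threshold).
  // Flags values more than a chosen number of standard deviations from the mean.
  function flagOutliers(values: number[], threshold = 3): number[] {
    const mean = values.reduce((s, v) => s + v, 0) / values.length;
    const variance = values.reduce((s, v) => s + (v - mean) ** 2, 0) / values.length;
    const sd = Math.sqrt(variance);
    if (sd === 0) return [];
    return values.filter((v) => Math.abs(v - mean) / sd > threshold);
  }

  // Example: one implausibly large revenue figure stands out for follow-up.
  const reportedRevenue = [120000, 95000, 110000, 105000, 98000, 4800000];
  console.log(flagOutliers(reportedRevenue, 2)); // [4800000]
  ```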

  10. Use of pre-filled information in the BRS The BRS instruments make use of two types of pre-filled information: • Pre-filled information from sources external to the instrument (i.e., information gathered in previous instruments or information provided by ATP such as issued patents) • Pre-filled information from sources internal to the instrument (i.e., information provided by the respondent in earlier sections of the current report)

  11. Pre-filled information: External source example

  12. Pre-filled information: Internal source example
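
  The two example slides above showed screenshots of the instrument. As a minimal sketch with hypothetical field names, pre-filling from an external source (data already on file) and an internal source (an answer given earlier in the same report) might be assembled like this:

  ```typescript
  // Minimal sketch of pre-filling, with hypothetical field and source names.
  // External pre-fills come from data on file (e.g., prior reports or ATP records);
  // internal pre-fills reuse an answer given earlier in the current report.
  interface PriorData {
    companyName: string;
    patentsIssued: number;
  }

  interface CurrentReport {
    totalProjectBudget?: number;
  }

  function buildPrefills(onFile: PriorData, current: CurrentReport) {
    return {
      // External source: carried over from records outside this instrument
      companyName: onFile.companyName,
      patentsIssued: onFile.patentsIssued,
      // Internal source: repeated from an earlier section of the same report,
      // left blank if the respondent has not answered it yet
      budgetForReview: current.totalProjectBudget ?? null,
    };
  }

  console.log(
    buildPrefills({ companyName: "Example Co.", patentsIssued: 2 }, { totalProjectBudget: 500000 })
  );
  ```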

  13. Required items • While most items in the BRS instruments are not required, the few that are fall into two categories: • Items required for accurate skips later in the instrument • Items deemed critical by ATP staff

  14. Required items: Example item important for skip pattern

  15. Required items: Example item critical to ATP
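
  The example slides showed screenshots from the instrument. As a minimal sketch with hypothetical items, a required answer can be validated and then used to resolve a later skip:

  ```typescript
  // Minimal sketch (hypothetical items): a required item whose answer drives a later skip.
  type YesNo = "yes" | "no";

  interface Responses {
    receivedOtherFunding?: YesNo; // required: controls whether the funding-detail section applies
    otherFundingAmount?: number;  // only asked when receivedOtherFunding === "yes"
  }

  function requiredItemErrors(r: Responses): string[] {
    const errors: string[] = [];
    if (r.receivedOtherFunding === undefined) {
      errors.push("Please answer the other-funding question; it determines which sections follow.");
    }
    return errors;
  }

  function nextSection(r: Responses): string {
    // The skip pattern can only be resolved once the required item is answered.
    return r.receivedOtherFunding === "yes" ? "other-funding-details" : "project-progress";
  }

  console.log(requiredItemErrors({}));                      // [error message]
  console.log(nextSection({ receivedOtherFunding: "no" })); // "project-progress"
  ```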

  16. Unique Design: Financial items

  17. Administration issues in the BRS: Multiple respondents • Each ATP-funded project has multiple contacts associated with it • It is rarely the case that a single respondent can answer all items in the survey. However, Westat provides only one access ID per report, so the respondents are responsible for managing who at their organization is given access to the BRS online system

  18. Experimental investigations using the BRS

  19. Reducing item nonresponse: The Applicant Survey • The ATP’s Applicant Survey is not one of the BRS instruments, but is regularly administered via the web to companies and organizations that applied for ATP funding • In 2006, Westat embedded an experiment within the Applicant Survey to test which of two different types of nonresponse prompting would result in reduced item nonresponse

  20. Reducing item nonresponse: The Applicant Survey 904 respondents were randomly assigned to one of three conditions: 1) Prompt for item nonresponse appeared (if applicable) at the end of the survey; 2) Prompt for item nonresponse appeared (if applicable) after each section; 3) No prompt (control group).
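
  As an illustrative sketch only (not the authors' actual randomization procedure), assignment of respondents to the three conditions might look like this:

  ```typescript
  // Minimal sketch of random assignment to three prompting conditions (hypothetical labels).
  type Condition = "prompt-at-end" | "prompt-after-each-section" | "no-prompt-control";

  function assignConditions(respondentIds: string[]): Map<string, Condition> {
    const conditions: Condition[] = ["prompt-at-end", "prompt-after-each-section", "no-prompt-control"];
    // Shuffle (Fisher-Yates) so assignment is independent of the order of the sample list.
    const shuffled = [...respondentIds];
    for (let i = shuffled.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
    }
    // Deal respondents into the three groups in roughly equal numbers.
    const assignment = new Map<string, Condition>();
    shuffled.forEach((id, index) => assignment.set(id, conditions[index % conditions.length]));
    return assignment;
  }

  console.log(assignConditions(["r001", "r002", "r003", "r004", "r005", "r006"]));
  ```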

  21. Reducing item nonresponse: The Applicant Survey [Screenshots: the item-nonresponse prompt as shown at the end of the survey and after each section]

  22. Reducing item nonresponse: The Applicant Survey Both prompts for item nonresponse appeared effective, and to an equal degree.

  23. Boosting response rates: The days of the week experiment • Literature suggests that there are optimal call times for telephone surveys. But are there also optimal days of the week to email survey communications? • Optimal day to email was measured by: • The overall response rate • The time it takes to respond

  24. Boosting response rates: The days of the week experiment • Three different experimental conditions: 1) Monday cohort 2) Wednesday cohort 3) Friday cohort • The invitation email and up to 3 reminders were all sent on the same day, either Monday, Wednesday, or Friday.

  25. Boosting response rates: The days of the week experiment

  26. [Charts: time to complete the survey and cumulative response rates]

  27. Boosting response rates: The days of the week experiment • Friday cohort trends toward higher response rates, but all cohorts require the same amount of effort to achieve their respective response rates • Overall, there is some evidence that the day of the week does matter

  28. Conclusion • The BRS has presented us with various design and administration challenges • We have had the chance to fine-tune and address a variety of issues that have come to our attention • As researchers encounter new issues in the administration of web surveys, the BRS offers a place to study them
