
E-Survey Workshop : Guidelines I and Practice

What is an electronic survey? Email surveys and Web-based surveys. The rapid development of surveys on the WWW is leading some to argue that internet surveys will soon replace traditional methods of survey data collection. Major advantages have been mentioned: cost savings, speed, and limited geographical constraints.


Presentation Transcript


    2. What is an Electronic Survey? Email Survey Web-based Survey

    3. Promises and Challenges Promises Strong reach / Penetration Fast response speed Low cost Response flexibility Control of anonymity Minimized data-entry error Minimized interviewer bias

    4. Promises and Challenges Challenges Generalizability and Response Rate: response rates to internet-based surveys have been declining over the past 10 years Accessibility: no national or global online directory of email addresses exists Design Concerns: use survey design and response features judiciously to maximize data quality and minimize error

    5. Sources of Errors in Web-based Survey Coverage Error The result of all units in a defined population not having a known nonzero probability of being included in the sample drawn to represent the population. A mismatch between the target population and the frame population.

    6. Sources of Errors in Web-based Survey Coverage Error Universal coverage of the Web remains quite limited. Coverage error represents the biggest threat to the representativeness of sample surveys conducted via the internet. Some populations do not exhibit large coverage problems: employees of certain organizations, members of professional organizations, certain types of businesses, students at many universities and colleges, and groups with high levels of education

    7. Sources of Errors in Web-based Survey Sampling Error The result of surveying a sample of the population rather than the entire population. Not all members of the frame population are measured.

    8. Sources of Errors in Web-based Survey Sampling Error
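    As a hedged illustration of sampling error (a standard textbook formula, not taken from the original slide), for an estimated proportion from a simple random sample of n completed responses:

    $$ SE(\hat{p}) = \sqrt{\hat{p}(1-\hat{p})/n}, \qquad \text{margin of error} \approx z_{\alpha/2} \cdot SE(\hat{p}) $$

    A larger sample reduces sampling error, but does nothing about coverage or non-response error.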

    9. Sources of Errors in Web-based Survey Measurement Error The result of inaccurate responses that stem from poor question wording, poor interviewing, survey mode effects, and/or some aspect of the respondent’s behavior.

    10. Sources of Errors in Web-based Survey Measurement error Measurement error is the deviation of the answers of respondents from their true values on the measure. The appearance of a survey can vary from respondent to respondent because of different browser settings, user preferences, variations in hardware, and so on. There is much work to be done to determine optimal designs for different groups of respondents and types of surveys.

    11. Sources of Errors in Web-based Survey Non-response error The result of non-response from people in the sample, who, if they had responded, would have provided different answers to the survey questions than those who did respond to the survey.

    12. Sources of Errors in Web-based Survey Non-response error Non-response error arises from the fact that not all people included in the sample are willing or able to complete the survey. Non-response error is a function of both the rates of non-response and of the differences between respondents and non-respondents on the variables of interest. There is at present little experimental literature on what works and what does not in terms of increasing response rates to web surveys.

    13. Types of Web survey (Couper, 2000) Probability-based Web-based surveys: Intercept surveys; List-based samples of high-coverage populations; Mixed-mode designs with choice of completion method; Pre-recruited panels of internet users; Probability samples of the full population. Non-probability-based Web-based surveys: Polls as entertainment; Unrestricted self-selected surveys; Volunteer opt-in panels.

    14. Seven Response Types for Web-based Surveys (Bosnjak and Tuten, 2001) Unit non-responders Complete responders Answering drop-outs Lurkers Lurking drop-outs Item non-responders Item non-responding dropouts

    15. Seven Response Types for Web-based Surveys (Bosnjak and Tuten, 2001)

    16. Design of Web-based Surveys (Dillman and Bowker, 2001) Introduce the web questionnaire with a welcome screen (Non-response error) Provide a PIN number (Sampling, Coverage) First question should be interesting to most respondents, easily answered, and fully visible on the first screen of the questionnaire. (Non-response) Present each question in a conventional format similar to that normally used on paper self-administered questionnaires. (Measurement, Non-response)

    17. Design of Web-based Surveys (Dillman and Bowker, 2001) (Cont.) Restrain the use of color so that figure/ground consistency and readability are maintained, navigational flow is unimpeded, and measurement properties of questions are maintained. (Measurement) Avoid differences in the visual appearance of questions that result from different screen configurations, operating systems, browsers, partial screen displays and wrap-around text. (Coverage, Measurement, Non-response)

    18. Design of Web-based Surveys (Dillman and Bowker, 2001) (Cont.) Provide specific instructions on how to take each necessary computer action for responding to the questionnaire, and give other necessary instructions at the point where they are needed. (Non-response) Use drop-down boxes sparingly, consider the mode implications, and identify each with a “click here” instruction. (Measurement) Do not require respondents to provide an answer to each question before being allowed to answer any subsequent ones. (Non-response)

    19. Design of Web-based Surveys (Dillman and Bowker, 2001) (Cont.) Provide skip directions in a way that encourages marking of answers and being able to click to the next applicable question. (Measurement) Construct web questionnaires so they scroll from question to question unless order effects are a major concern, and/or telephone and web survey results are being combined. (Coverage, Measurement, Non-response) When the number of answer choices exceeds the number that can be displayed in a single column on one screen, consider double-banking with an appropriate grouping device to link them together. (Measurement)

    20. Design of Web-based Surveys (Dillman and Bowker, 2001) (Cont.) Use graphical symbols or words that convey a sense of where the respondent is in the completion process, but avoid ones that require significant increases in computer memory. (Coverage, Non-response) Exercise restraint in the use of question structures that have known measurement problems on paper questionnaires, e.g. check-all-that-apply and open-ended questions. (Measurement, Non-response)

    21. Other Web Survey Design Suggestions: Zanutto (2001) Use a cover letter with the questionnaire Make the survey simple, and have it take no longer than 20 minutes Give an estimated time that it will take to complete the survey Be sure the first question is interesting, easy to answer, and related to the topic of the survey Be concerned about privacy issues for the respondents and the data that are collected Allow an alternate mode of completion if respondents are concerned about privacy, e.g. printing and mailing in the survey.

    23. How to Create a Web-survey


    25. The Advantages of Web-Survey Cost savings Ease of contacting respondents More efficient data processing Ability to customize

    26. How to Design STEP 1: Introduction
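    As a hedged sketch of such an introduction screen (the wording, file names, and markup below are illustrative and not taken from the original slides), a minimal welcome page that states the purpose and the estimated completion time might look like:

    <!-- welcome.htm (hypothetical): introduction screen shown before the questionnaire -->
    <html>
      <head><title>E-Survey Workshop Questionnaire</title></head>
      <body>
        <h2>Welcome to our survey</h2>
        <p>This questionnaire takes about 10 minutes to complete.
           Your answers are anonymous and will be used for research purposes only.</p>
        <!-- link to the first question screen; the file name is hypothetical -->
        <p><a href="survey.htm">Click here to start</a></p>
      </body>
    </html>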

    27. (1)?????? ?????? ???? ???? (2)?????? ??????800*600,??????????????????? STEP2:Layout

    28. STEP 3: Webpage Components

    29. STEP 4: Save Results The questionnaire page itself can be built with an HTML editor such as FrontPage or Dreamweaver, but storing the submitted answers requires a server-side program: responses are typically written to a file or database by a script written in ASP, PHP, or Java.
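    A hedged sketch of the page skeleton such an editor would produce (the handler name save_survey.php is an assumption; the slide only says the server-side part could be written in ASP, PHP, or Java):

    <!-- questionnaire page: the form posts every answer to a server-side script -->
    <html>
      <body>
        <form name="esurvey" method="post" action="save_survey.php">
          <!-- questions (radio buttons, check boxes, drop-down boxes, text boxes) go here -->
          <input type="submit" value="Submit">
        </form>
      </body>
    </html>
    <!-- save_survey.php (hypothetical) would append the posted fields to a .txt or .csv file -->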

    30. STEP 5: Analyze Results Code the collected responses and export them in a file format that statistical software can read (e.g. .txt or .csv) for later analysis.

    31. STEP BY STEP : Form

    32. STEP BY STEP : Form

    33. STEP BY STEP : Pixels

    34. STEP BY STEP : Pixels

    35. STEP BY STEP : Table

    36. STEP BY STEP : Table

    37. STEP BY STEP : Table
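    These slides walk through inserting a table in the HTML editor; a hedged sketch of how a table can lay out one question next to its answer cell (the question text and widths are illustrative):

    <!-- two-column layout: question text on the left, answer controls on the right -->
    <table width="760" border="1" cellpadding="5">
      <tr>
        <td width="60%">1. How often do you use the internet?</td>
        <td width="40%"><!-- the radio buttons for this question go in this cell --></td>
      </tr>
    </table>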

    38. STEP BY STEP : Radio Button

    39. STEP BY STEP : Radio Button

    40. STEP BY STEP : Radio Button

    41. STEP BY STEP : Radio Button
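    A hedged sketch of a radio-button question (the field name q1 and the answer labels are illustrative): all options share one name attribute so only a single answer can be selected, and each option carries a distinct value, which is what gets saved.

    <!-- single-choice question: same name, different values -->
    <p>1. How often do you use the internet?</p>
    <input type="radio" name="q1" value="1"> Every day<br>
    <input type="radio" name="q1" value="2"> A few times a week<br>
    <input type="radio" name="q1" value="3"> Less than once a week<br>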

    42. STEP BY STEP : Check Box

    43. STEP BY STEP : Check Box

    44. STEP BY STEP : Check Box
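    A hedged sketch of a check-all-that-apply question (field names and labels are illustrative): unlike radio buttons, each check box has its own name, so several answers can be selected and saved at once.

    <!-- multiple-choice question: each box is saved as a separate field -->
    <p>2. Which of the following have you used? (check all that apply)</p>
    <input type="checkbox" name="q2_email" value="1"> Email survey<br>
    <input type="checkbox" name="q2_web" value="1"> Web-based survey<br>
    <input type="checkbox" name="q2_paper" value="1"> Paper questionnaire<br>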

    45. STEP BY STEP : Dropdown

    46. STEP BY STEP : Dropdown

    47. STEP BY STEP : Dropdown

    48. STEP BY STEP : Dropdown

    49. STEP BY STEP : Dropdown
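    A hedged sketch of a drop-down question (the field name and options are illustrative); following the Dillman and Bowker guideline above, the pre-selected first option is a “click here” prompt rather than a real answer, so a default choice is never recorded by mistake.

    <!-- drop-down box: the first option is an instruction, not an answer -->
    <p>3. What is your highest level of education?</p>
    <select name="q3">
      <option value="" selected>Click here to choose</option>
      <option value="1">High school or less</option>
      <option value="2">College or university</option>
      <option value="3">Graduate degree</option>
    </select>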

    50. STEP BY STEP : Text Box

    51. STEP BY STEP : Text Box

    52. STEP BY STEP : Text Box
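    A hedged sketch of text-entry fields (field names and sizes are illustrative): a single-line text box for short answers and a multi-line text area for open-ended comments.

    <!-- short answer -->
    <p>4. What is your occupation?
       <input type="text" name="q4" size="30" maxlength="100"></p>
    <!-- open-ended comment -->
    <p>5. Do you have any other comments?<br>
       <textarea name="q5" rows="4" cols="60"></textarea></p>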

    53. STEP BY STEP : ??

    54. STEP BY STEP : ??

    55. STEP BY STEP : ??

    56. STEP BY STEP : ??

    57. STEP BY STEP : ??

    58. STEP BY STEP : Submit

    59. STEP BY STEP : Submit
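    A hedged sketch of the closing controls of the form (button labels and names are illustrative): the submit button posts the answers to the script named in the form’s action attribute, and an optional reset button clears the entries.

    <!-- placed just before </form>: submit posts the answers, reset clears them -->
    <input type="submit" name="send" value="Submit questionnaire">
    <input type="reset" name="clear" value="Clear answers">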

    60. STEP BY STEP : Saving

    61. STEP BY STEP : Saving

    62. STEP BY STEP : Saving

    63. STEP BY STEP : Saving


    65. References Bosnjak, M. M. and Tuten, T. L., 2001. “Classifying Response Behaviors in Web-Based Surveys” Journal of Computer-Mediated Communication, Vol. 6, No. 3, at http://www.ascusc.org/jcmc/vol6/issue3/boznjak.html. Couper, M. P., 2000. “Web Surveys: A Review of Issues and Approaches” Public Opinion Quarterly, Vol. 64, No. 4, pp. 464-481. Dillman, D. A., 2000. Mail and Internet Surveys: The Tailored Design Method, second edition. New York: Wiley. Dillman, D. A., 2002. “Navigating the Rapids of Change: Some Observations on Survey Methodology in the Early 21st Century” presidential address to the American Association for Public Opinion Research. Dillman, D. A. and Bowker, D. K., 2001. “The Web Questionnaire Challenge to Survey Methodologists” at http://survey.sesrc.wsu.edu/dillman/zuma_paper_dillman_bowker.pdf.

    66. References Dillman, D. A., Tortora, R. D. and Bowker, D., 1998. “Principles for Constructing Web Surveys” SESRC Technical Report 98-50, Pullman, Washington, at http://survey.sesrc.wsu.edu/dillman/papers/websurveyppr.pdf. Smith, T. W., 2001. “Are Representative Internet Surveys Possible?” Proceedings of Statistics Canada Symposium. Sheehan, K. B., 2002. “Online Research Methodology: Reflections and Speculations” Journal of Interactive Advertising, Vol. 3, No. 1.
