
How to Design a Survey Instrument


Presentation Transcript


  1. How to Design a Survey Instrument Stephanie L. Wehnau Nicole L. Sturges Center for Survey Research, Penn State Harrisburg

  2. The Process of Survey Research Define research objectives Choose mode of collection Choose sampling frame Construct and pretest questionnaire Design and select sample Design and implement data collection Code and edit data Make post-survey adjustments Perform analysis

  3. 1. Topic Mapping

  4. Topic Mapping Start with the research objectives List the topics to be investigated Translate topics into questions (measures)

  5. Topic Mapping Example • Questionnaire that evaluates alcohol use at Penn State Harrisburg (PSH) for Student Health Services: • To what extent are students using alcohol at PSH? • What beliefs exist about alcohol use at PSH? • What do students know about the negative effects of alcohol use? • What are the demographics of survey respondents?

  6. 2. Writing Questions

  7. Guidelines for Survey Design Choose your mode (paper, phone, web, face-to-face) Keep your instrument short (respondent burden) Order questions from non-threatening to more sensitive topics (ask demographics last) - build rapport! Ask the same question different ways (validity) Adopt or adapt questions from existing surveys (e.g., the Census) PRE-TEST! (More on this later…)

  8. Types of Questions: Open-ended Questions • Respondents have the opportunity to provide an answer of their own. • May be more difficult to analyze given the extensive amount of information that can be collected. • Examples: • What do you think is the most important problem facing Pennsylvania today? • What is your age?

  9. Types of Questions: Closed-ended Questions The respondent is asked to select an answer from a list provided by the researchers. Widely used because they provide a greater uniformity of responses and are more easily analyzed. Select one response or “Select all that apply”

  10. Types of Questions: Closed-ended Questions • “Select one” Example: • In general, how satisfied are you with the way things are going in Pennsylvania today? • Very satisfied • More or less satisfied • Not satisfied at all • Don’t know • Declined to answer

  11. Types of Questions: Closed-ended Questions • “Select all that apply” Example: • What type of insurance do you have? Please select all that apply. • Employer-based health insurance • Medicare • Government-provided insurance • Military health care • Purchased health insurance • Other insurance • Don’t know • Declined to answer

  12. Types of Questions: Partially Closed-ended Questions • The respondent can select from one of the response options or can supply his/her own response in an “other” category. • Example: • Which of the following is your favorite college men’s sport? • Football • Basketball • Baseball • Other: ___________

  13. Writing Questions General Tips

  14. Avoid ambiguous words (write concrete questions) • Be specific (with definitions, time frames, locations, etc.). • Examples: • Poor: • When did you move to Pennsylvania? • Right after I finished college. • When I was 20. • In 1998. • Better: • In what year did you move to Pennsylvania? • Poor: • How often do you exercise in a typical week? • Better: • How often did you exercise during the past week (start with today’s date and count back 7 days)?
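The "count back 7 days" anchoring above can be automated when the instrument is administered on the web, so every respondent sees the same concrete window. A minimal Python sketch (the function name and question wording are illustrative, not from the slides):

```python
from datetime import date, timedelta

def recall_window(days=7, today=None):
    """Return (start, end) dates for an N-day recall question,
    counting back from today's date as the slide suggests."""
    end = today or date.today()
    start = end - timedelta(days=days)
    return start, end

# Hypothetical usage: fill the concrete window into the question text.
start, end = recall_window(7, today=date(2024, 3, 15))
question = (f"How often did you exercise between "
            f"{start:%B %d} and {end:%B %d}?")
```

Presenting explicit dates instead of "a typical week" removes the ambiguity the slide warns about, at the cost of a per-respondent question stem.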

  15. Define terms and avoid abbreviations • Don’t leave it up to the respondent to interpret a term – provide a definition so that everyone is answering the same question. • Example: • A drink of alcohol is considered 1 can or bottle of beer, 1 glass of wine, 1 can or bottle of wine cooler, 1 cocktail, or 1 shot of liquor. During your current pregnancy, how many days per week have you had at least one drink of any alcoholic beverage? • Do not assume that your respondents will be familiar with abbreviations.

  16. Ask one question at a time (avoid double-barreled questions) • Poor: • Do you like apples and oranges? • Do you want to be rich and famous? • Better: • Do you like apples? • Do you like oranges? • Do you want to be rich? • Do you want to be famous?

  17. Beware of questions that elicit socially desirable responses • Respondents may give false answers because they are embarrassed or feel bad about their answer. • Examples: • Do you view sexual material on the Internet? • Have you ever smoked marijuana?

  18. Avoid predisposing respondents • Remain NEUTRAL! Don’t guide your respondents to an answer. • Feedback during an interview (in-person or via phone) can also bias a respondent. • Examples: • More people have seen the movie, Gone with the Wind, than any other motion picture. Have you seen this movie? • Newspapers and television started talking about patient safety and similar problems in our healthcare system about five years ago. Do you think that, in the past five years, patient safety has gotten better, stayed the same, or gotten worse?

  19. Avoid negative questions • Negative questions can be confusing! • Examples: • The United Nations should NOT have more authority to intervene in military affairs. • Our Diocese today has not developed innovative ministries for small and rural communities.

  20. 3. Writing Responses

  21. Creating Response Sets Mutually exclusive (can’t fall into 2 categories) and exhaustive (all possible) categories Appropriate number of options (too many = confusing; too few = respondent can’t answer) Appropriate range of options (look at distribution of responses; income example) Consider rotating response options because respondents tend to answer with the first or last response option they read/hear (if applicable)
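Rotating response options (the last point above) is straightforward to script in a web survey. A minimal sketch in Python; the names here are illustrative, and note it keeps "Don't know"-style options anchored at the end, since only the substantive options should rotate:

```python
import random

ANCHORED = {"Don't know", "Declined to answer"}  # always stay at the end

def rotated_options(options, rng=None):
    """Shuffle the substantive response options for one respondent,
    leaving anchored options ('Don't know', etc.) in place at the end."""
    rng = rng or random.Random()
    body = [o for o in options if o not in ANCHORED]
    tail = [o for o in options if o in ANCHORED]
    rng.shuffle(body)
    return body + tail

opts = ["Very satisfied", "More or less satisfied",
        "Not satisfied at all", "Don't know", "Declined to answer"]
shown = rotated_options(opts, rng=random.Random(42))
```

A full shuffle suits unordered lists like the insurance example earlier; for an ordered satisfaction scale, simply reversing the order per respondent is the more common rotation.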

  22. 4. Testing Your Instrument Four Stages of Traditional Testing

  23. Testing – Stage 1 • Developmental work • Typical aims of this stage are: • Explore new topic areas (consult professional experts and cultural insiders) • Test feasibility of methods • Focus on problem areas

  24. Testing – Stage 2 • Question testing • Typical aims of this stage are: • Test and refine the wording of individual questions • Can respondents understand questions? • Can interviewers ask the questions? • Do the questions answer what you think they should answer? • Check the flow of the questionnaire as a whole • Do the questions and sections flow well? • Look for possible context effects • Check skip patterns (if not computer-assisted) • Look for respondent burden (interest in topic, length, etc.)

  25. Testing – Stage 2a • Informal testing (before you test questions on members of your population) • Read your questionnaire aloud to yourself • Highlights the differences between written and spoken language • A question that looks great on paper can still be difficult to read aloud • Interview yourself • Can YOU answer the question? • Is the question easy to answer? • Do mock interviews with colleagues *These suggestions may seem simple, but they are very effective for quickly finding errors early in the questionnaire design process.

  26. Testing – Stage 2b • Test your questions with members of your population • Examples: • Respondent debriefing (2-part process: go through survey questions, then debriefing; ask respondents how they came up with their responses) • Cognitive interviewing (think-aloud method) • Converse and Presser (1986) suggest testing questions with 25-75 individuals. • Fowler (1995) suggests 10-25 individuals.

  27. How many tests at Stage 2b? If time and money allow, it is a good idea to do two. In the first pre-test, you identify which questions don’t work. You then improve them (or at least think you do!) – the only way to know for sure is to re-test them. Ideally, you should not include a question in the final instrument that has not been tested.

  28. Testing – Stage 3 Usability Testing • If the survey is a computer-assisted questionnaire (phone or web survey), then you will need to test the usability of the questionnaire. • In other words, is the programming correct?
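For a computer-assisted instrument, part of "is the programming correct?" is verifying that every skip pattern routes to a defined question and that no question is unreachable. A minimal sketch of such a check in Python (the questionnaire structure and function name are hypothetical, not from the slides):

```python
# Skip logic: question id -> {answer: next question id}; None ends the survey.
SKIP = {
    "Q1": {"Yes": "Q2", "No": "Q3"},
    "Q2": {"Yes": "Q3", "No": "Q3"},
    "Q3": {"Yes": None, "No": None},
}

def check_skip_patterns(skip, start="Q1"):
    """Walk every branch from the start question and report questions
    that are unreachable or that route to an undefined question."""
    reachable, errors = set(), []
    stack = [start]
    while stack:
        q = stack.pop()
        if q in reachable:
            continue
        reachable.add(q)
        for answer, nxt in skip[q].items():
            if nxt is None:
                continue  # survey ends on this answer
            if nxt not in skip:
                errors.append(f"{q} -> {nxt} (undefined) on '{answer}'")
            else:
                stack.append(nxt)
    errors += [f"{q} unreachable" for q in skip if q not in reachable]
    return errors
```

`check_skip_patterns(SKIP)` returns an empty list when the routing is clean; running it on every build of the instrument catches broken skips before respondents do.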

  29. Testing – Stage 4 Dress Rehearsal Pilots • Typical aims of the stage are: • Test the survey procedures as a whole in survey conditions • Estimate rates of response • Check timings • Develop pre-codes for open-ended questions • Smooth coordination and establish routines • Test other aspects of the survey methodology • How many respondents? • Converse and Presser (1986): n=25-75+ • U.S. Census long form test n=16,000 *This stage is not always necessary. The first four purposes could be carried out as part of Stage 2b.

  30. Considerations for Testing • The scale of your testing will depend on: • Budget • Nature of the questionnaire • Size of the population (in some cases)

  31. QUESTIONS? Stephanie L. Wehnau Assistant Director, Center for Survey Research Phone: 717-948-6429 Email: slh227@psu.edu Nicole L. Sturges Project Coordinator, Center for Survey Research Phone: 717-948-6117 Email: nls17@psu.edu
