slide1
The Causes of Survey Mode Differences

and Their Implications for Survey Design

by

Don A. Dillman

Thomas S. Foley Distinguished Professor of Government & Public Policy and

Deputy Director of the Social & Economic Sciences Research Center

Washington State University

Pullman, Washington 99164-4014

http://survey.sesrc.wsu.edu/dillman/

A Presentation made to The Gallup Organisation Europe in Brussels, Belgium,

February 24, 2003

©Don A. Dillman

slide2
Why Discuss Surveys That Collect Data Using More Than One Mode (e.g., face-to-face plus telephone, or the Internet)?
  • High costs and response rate concerns have forced many survey sponsors to reconsider traditional face-to-face (and telephone) one-mode surveys.
  • We now have at least five survey modes capable of collecting quality data.
  • Combining new and old modes may improve response rates and lower costs.

©Don A. Dillman, The Social & Economic Sciences Research Center February 2003

slide3
Objective

To answer three questions:

  • Do different survey modes produce different answers to survey questions?
  • What are the reasons for those differences?
  • What might be done to reduce differences across modes?

slide4
More Survey Modes are Available For Use Than in the Past
  • Telephone
  • Face-to-Face Interviews
  • Touchtone Data Entry or Interactive Voice Response (IVR)
  • The Internet
  • Mail

(See Dillman, D. A. 2002. “Navigating the Rapids of Change.” Public Opinion Quarterly 66(3): 473-494.)

slide5
Mode Differences That Affect Respondent Answers

Conclusion: Mail and telephone are the most different; the other modes fall somewhere between.

slide6
Past Research on Mode Differences
  • Many differences have been observed in past research, including:

Social desirability

Acquiescence

Primacy/Recency

Extremeness on scales

  • Our research knowledge is better for some modes than for others.

slide7
Interviewer Presence May Have a Normative Influence: Social Desirability

How would you rate your health?

[Bar chart: percent choosing Excellent, Good, Fair, or Poor, shown for Mail, Telephone, and Face-to-face]

(Hochstim, 1967)

Respondents are more likely to give the culturally acceptable answer on the telephone or in person.

slide8
Interviewer Presence May Have a Normative Influence: Social Desirability

Results from a self-administered questionnaire followed by a face-to-face re-interview one month later (Biemer, 1997).

slide9
Interviewer Presence May Have a Normative Influence: Social Desirability

How often do you drive a car after drinking alcoholic beverages?

[Bar chart: percent answering Frequently, Occasionally, Seldom, Never, or Don’t Know, by mode]

(2 = 14.8 p < .001)

Respondents are more likely to give the culturally acceptable answer.
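Mode comparisons like this one are typically evaluated with a Pearson chi-square test on the mode-by-category contingency table. A minimal pure-Python sketch; the counts below are invented for illustration and are not the study's data:

```python
def chi_square(observed):
    """Pearson chi-square statistic and degrees of freedom
    for an r x c contingency table given as a list of rows."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    df = (len(observed) - 1) * (len(observed[0]) - 1)
    return stat, df

# Rows: mail, telephone; columns: Frequently, Occasionally, Seldom, Never
counts = [
    [12, 48, 90, 150],  # mail (self-administered)
    [4, 30, 86, 180],   # telephone (interviewer present)
]
stat, df = chi_square(counts)
print(f"chi-square = {stat:.2f}, df = {df}")
```

With df = 3, values above roughly 7.81 are significant at the .05 level.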

slide10
An Effect of Interviewer Presence: Acquiescence, the Tendency to Agree

Percent of respondents who agree with each of 9 opinions on how to encourage seat belt use

(Dillman & Tarnai, 1991, pp. 73-93)

slide11
An Effect of Locus of Control: Primacy vs. Recency

Student Survey: To what extent do students benefit from cheating?

(2 6.9df = 3 p  .07)

Primacy in mail (choosing first answer categories) vs. recency in telephone (choosing last categories).

slide12
Effects of Locus of Control: Primacy vs. Recency

Student Survey: To what extent do students benefit from cheating?

[Two panels: order left to right (χ² = 6.9, df = 3, p ≤ .07) and order reversed (χ² = 6.49, df = 3, p ≤ .10)]

Primacy in mail (choosing first answer categories) vs. recency in telephone (choosing last categories)

slide13
59 Primacy/Recency Comparisons from 12 Studies*

*Dillman et al., 1995. Rural Sociology 60(4): 674-687

Results from these experiments show much inconsistency.

slide14
23 Explicit Mail/Telephone Comparisons from Three Studies*

*Dillman et al., 1995. Rural Sociology 60(4): 674-687

Results from the direct comparisons show much inconsistency.

slide15
Primacy vs. Recency: Conclusions

Considerable literature has suggested that primacy in mail and recency in telephone may exist because of cognitive processing patterns. There are clear cases of effects in the literature, as well as exceptions.

I suspect that the issue is much more complicated than cognitive processing alone; social desirability and visual effects may exert simultaneous and competing influences.

slide16
An Effect of Aural Communication: Greater Extremeness on Four-Point Scales in General Public Survey

Percent choosing “not a problem” (vs. small, moderate, or serious)

[Bar chart: percent by mode for nine community issues: police protection, quality of schools, low-income housing, resident concerns, rec/park facilities, medical services, streets/roads/sidewalks, street lighting, and houses/apts. to rent]

Mean for 9 Questions: Telephone, 47.1; Mail, 31.9; Difference, 15.2

(Dillman & Mason, 1984)

slide17
An Effect of Aural Communication: Greater Extremeness on Four-Point Scales in Student Survey

Issue is NOT a problem.

(Tarnai & Dillman, 1986)

slide18
Can Extremeness Be Reduced By Asking Respondents to View Show Cards When Responding?

Perhaps.

Student experiment with three treatment groups:

1. Visual: Self-administration

2. Aural: Telephone

3. Aural plus Visual: Telephone interview conducted with respondent while viewing self-administered version.

slide19
Effects on Extremeness of Three Kinds of Administration

Percent who chose “not a problem” (vs. small, moderate, serious).

Combined administration (visual + aural) may moderate extremeness.

(Tarnai & Dillman, 1990)

slide20
A Study of Aural vs. Visual Communication: Telephone, IVR, Mail, and Web

A Gallup study compared telephone and IVR (aural modes) and mail and web (visual modes). Questions were comparable across modes. (Dillman, Tortora, Phelps, Swift, Kohrell and Berck, 2000)

Q2. Overall, how satisfied are you with your long distance company?

1 Not at all satisfied
2
3
4
5 Extremely satisfied

Q3. Considering what you get and what you pay for it, how would you rate the overall value of your long distance company’s products and services?

1 Terrible
2
3
4
5 Outstanding

slide21
Effects of Aural vs. Visual Communication in Gallup Experiment.

Percent choosing the positive labeled end-point

[Bar chart: percent choosing the positive end-point on questions Q2 through Q6, shown for Telephone, IVR, Mail, and Web]

slide22
Summary of Major Observed Differences Among Survey Modes

Social Desirability Bias: Telephone and face-to-face (cultural influence).

Acquiescence Bias: Tendency to agree on telephone and face-to-face (cultural influence).

Primacy Bias: Mail and web (cognitive influence).

Recency Bias: Telephone (cognitive influence).

Extremeness on Scale Bias: Telephone and face-to-face (aural vs. visual communication).

slide23
Construction as a Source of Mode Differences

Survey designers tend to construct items differently for different survey modes.

Telephone Favors:

  • Shorter scales.
  • Fewer words (e.g., numbers with endpoint labels).
  • Dividing questions into multiple steps.
  • Accepting “no opinion/don’t know/refuse” only if volunteered.
  • Complex branching formats built from simpler questions.
  • Open-ended formats with follow-up probes.
  • Yes/No questions instead of check-all-that-apply formats.
  • Keeping questionnaires short.

slide24
Construction as a Source of Mode Differences

(continued)

Face-to-Face Favors:

  • Use of show cards for complex/lengthy questions and to maintain respondent interest.
  • Unlimited scale length or format (with show cards).
  • Longer questionnaires are feasible.

IVR Favors:

  • Even briefer wording formats than telephone.
  • Even shorter scales than telephone.
  • Even shorter questionnaires than telephone.

slide25
Construction as a Source of Mode Differences

(continued)

Mail Favors:

  • Avoiding branching formats.
  • Use of longer scales with words, numbers, symbols, and graphical information to aid comprehension.
  • Avoiding open-ended formats, or asking them as a sequence of shorter questions.
  • Check-all-that-apply instead of Yes/No questions.

slide26
Construction as a Source of Mode Differences

(continued)

Internet Favors:

  • Check-all-that-apply formats (HTML checkbox feature).
  • Use of longer scales with words, numbers, symbols, and graphical information to aid comprehension.
  • Addition of audio, video, color, and other dynamic variations.
  • Keeping questionnaires short.

slide27
Unintentional Construction Effect: Open vs. Closed Format

Telephone: What is your marital status? (Open-ended, coded by interviewer.)

Web: What is your marital status? (Click the category that applies. If you inadvertently click the wrong category, you can correct the error by simply clicking on the correct category.)

Single

Married

Separated

Divorced

Widowed

slide28
Unintentional Construction Difference: Results from a Telephone (Open-Ended) and Web (Close-Ended) Comparison

What is your marital status?

Differences are significant and tracked across additional panels. (Tortora, 2001)

slide29
Unintentional Construction Effect: Volunteered Responses on Telephone

To what extent do you favor or oppose expansion of NATO?

Both modes offered:

1 Strongly favor
2 Somewhat favor
3 Somewhat oppose
4 Strongly oppose

Telephone only (recorded if volunteered):

5 NO OPINION
6 DON’T KNOW
7 REFUSE

slide30
Examples of Unintentional Construction Difference: Check-All-That-Apply vs. Response to Each Item
  • Mail, Web: Please check which of these appliances you have in your home.
  • Telephone: Which of the following electric appliances do you have in your home? Do you have … (read each item and obtain answer).
  • Mail/Web (check all that apply): Toaster, Broiler oven, Food blender, Meat slicer, (other items)
  • Telephone (a Yes/No answer for each): Toaster, Broiler oven, Food blender, Meat slicer, (other items)

The threat to data quality in check-all-that-apply questions is “satisficing,” i.e., answering only until the respondent feels he or she has given a satisfactory answer.

slide31
Unintentional Construction Difference: Breaking Questions Into Parts on Telephone or IVR

Step 1: Are you …

Satisfied

Dissatisfied

Neither Satisfied nor Dissatisfied

Step 2:

If Satisfied – Would that be Completely Satisfied, Mostly Satisfied, or Somewhat Satisfied?

If Dissatisfied – Would that be Completely Dissatisfied, Mostly Dissatisfied, or Somewhat Dissatisfied?

If Neither – Would you tend towards being Slightly Satisfied, Slightly Dissatisfied, or Completely Neutral?
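For analysis, the two-step answers can be recombined into a single ordinal score so they can be compared with a one-question visual mode. A sketch; the 9-point coding below is an assumed convention, not something specified in the presentation:

```python
# Map each (step 1, step 2) answer pair onto one 1-9 satisfaction score.
# 9 = Completely Satisfied ... 1 = Completely Dissatisfied (assumed coding).
SCALE = {
    ("Satisfied", "Completely"): 9,
    ("Satisfied", "Mostly"): 8,
    ("Satisfied", "Somewhat"): 7,
    ("Neither", "Slightly Satisfied"): 6,
    ("Neither", "Completely Neutral"): 5,
    ("Neither", "Slightly Dissatisfied"): 4,
    ("Dissatisfied", "Somewhat"): 3,
    ("Dissatisfied", "Mostly"): 2,
    ("Dissatisfied", "Completely"): 1,
}

def combine(step1, step2):
    """Return the single-scale score for a two-step answer pair."""
    return SCALE[(step1, step2)]

print(combine("Satisfied", "Mostly"))            # 8
print(combine("Neither", "Completely Neutral"))  # 5
```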

slide32
Conclusion on Construction Effects

One of the most common reasons that mode differences appear in surveys is that questions are constructed differently across modes.

The first step in reducing mode differences is therefore to reduce construction differences.

slide33
Differences in Visual Layouts May Also Affect Answers
  • Increasing evidence exists that written questions are communicated through multiple languages – words, symbols, numbers, and graphics.
  • Graphics is the conduit through which other languages are communicated and consists of such factors as figure-ground composition, location and spacing, size changes, brightness variations, and changes in similarity and regularity.
  • See the paper to be presented at the 2003 American Association for Public Opinion Research (AAPOR) meetings by Don A. Dillman and Leah Christian, “How Words, Symbols, Numbers, and Graphics Affect Respondent Answers to Self-Administered Surveys: Results from 18 Experiments,” at http://www.sesrc.wsu.edu/dillman/.

slide34
Four Examples of How Changes in Visual Layout Changed Respondent Answers

Vertical vs. horizontal layout of scales.

Change in size of open-end answer space.

Check-all-that-apply vs. Yes/No format.

Answer box vs. scalar layout.

slide35
Is Response to Ordinal Scale Influenced by Linear vs. Nonlinear Visual Format? (Dillman and Christian, 2002)

[Two question layouts shown: Nonlinear and Linear]

slide36
Result: Uneven Number (Five) of Categories With Scale Running Horizontally, by Row

Mean 2.4 vs. 2.4

(χ² = 10.8, p < .05)

*Categories where differences are most likely to occur

slide37
Does Amount of Space Influence Answers to Open-ended Questions? (Dillman and Christian, 2002)

Expected answers could exceed allocated space

slide38
Number of Words in Response to Three Questions

slide39
Number of Themes Mentioned in Each Answer

slide40
Does a Change From Check-All-That-Apply to Yes/No Format Influence Choices? (Dillman and Christian, 2002)

[Two questionnaire forms shown: Form A and Form B]

slide41
Rationale for Considering Change from Check-All-That-Apply to Yes/No Format
  • Reduce tendency to satisfice.
  • Develop same format for self-administered questionnaires that is already used in telephone surveys.
  • Drawback: respondents may only check the “Yes” choice and leave others blank. Does a blank answer mean “No”?
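One way to gauge that ambiguity is to compute the endorsement rate under both readings of a blank cell; the gap between the two rates bounds its effect. A sketch with invented answers, where `None` stands for a blank:

```python
# One Yes/No grid item; None marks a cell the respondent left blank.
answers = ["Yes", "Yes", None, "No", "Yes", None, "No", "Yes"]

yes = answers.count("Yes")

# Reading 1: every blank counts as an implicit "No".
blank_as_no = yes / len(answers)

# Reading 2: blanks are missing data and are dropped from the base.
non_blank = [a for a in answers if a is not None]
blank_as_missing = yes / len(non_blank)

print(f"blank = No:      {blank_as_no:.0%}")       # 50%
print(f"blank = missing: {blank_as_missing:.0%}")  # 67%
```

The true endorsement rate lies somewhere between the two figures.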

slide42
Result: Percent who consider themselves fans of each Cougar varsity sport, Yes/No vs. check-all-that-apply formats

[Bar chart comparing the two formats for each sport: men’s baseball, women’s basketball, men’s basketball, women’s cross-country, men’s cross-country, football, women’s golf, men’s golf, women’s rowing, women’s soccer, women’s swimming, women’s tennis, women’s track, men’s track, and women’s volleyball]

slide43
Conclusion: Effect of Changing from Check-All-That-Apply to Yes/No Format
  • With the Yes/No format, all sports except football received 4 to 14% more mentions.
  • 11% of respondents checked only Yes answers.
  • The number of blank answers on the Yes/No format ranged from 9-13% (excluding football, at 3%); these respondents appear to have answered as if it were a check-all-that-apply question.
  • The Yes/No format appears to reduce satisficing.

slide44
Does a Number Box Produce Same Responses as a Five-Point Scale with Labeled Endpoints? (Dillman and Christian, 2002)

[Two versions shown: Labeled Endpoints and Number Box]

slide45
Result: On a scale of 1 to 5, where 1 means “Very Satisfied” and 5 means “Very Dissatisfied”, how satisfied are you with the classes you are taking this semester?

Mean 2.4 vs. 2.8

(t = 7.7, p < .001)

slide46
Conclusion: Three Tests of Number Box vs. Polar-Point Scale
  • On all three questions, respondents gave significantly less favorable answers when the number box was used.
  • This may have resulted from some respondents reversing the scale: on the number box version, 10% of respondents (vs. 1% for the polar-point scale) scratched out an answer to at least one of these questions and provided a different one.

slide47
Evidence of Confusion Among Number Box Respondents
  • 74 (14%) of respondents changed 86 answers.
  • Two-thirds (52) of them made these 59 reversals:

changed 4 to 2: 44

changed 2 to 4: 4

changed 5 to 1: 10

changed 1 to 5: 1

  • These data suggest that the direction of the scale was mentally reversed by a significant number of respondents.
  • Others probably did not realize an error had been made: correlations of number box answers with 13 other satisfaction items were consistently lower than for the polar-point scale (36 comparisons lower vs. 3 higher).
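The dominant corrections (4 changed to 2, 5 changed to 1) are exactly what a mental reversal of a 1-5 scale produces, since a reversed answer maps x to 6 - x. A minimal sketch of that recode:

```python
def reverse_5pt(x):
    """Map a 1-5 answer to the opposite end of the scale (1<->5, 2<->4)."""
    if not 1 <= x <= 5:
        raise ValueError("answer must be 1-5")
    return 6 - x

# The most common scratched-out corrections match this mapping:
print(reverse_5pt(4))  # 2
print(reverse_5pt(5))  # 1
```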

slide48
Additional Conclusions: Number Box vs. Polar-Point Scale
  • Some respondents may have “expected” more positive answers (e.g., Very Satisfied) to be assigned higher numbers.
  • Use of graphical and symbolic language to support verbal language can eliminate this potential confusion.

slide49
How Do We Minimize Mode Differences Across Surveys? Control of Question Formats

1. Keep wording the same (all modes).

2. Keep questions simple (fewer response choices) and use less abstract response choices (e.g., polar-point labeled scales).

3. Avoid using “hidden” response categories (telephone and face-to-face).

4. Keep visual layout the same (mail and web).

5. Do not use check-all-that-apply questions (mail and web).

slide50
Additional Suggestions for Minimizing Mode Differences in Mixed-Mode Surveys

1. Limit a study to similar modes, e.g., telephone and face-to-face, or, in other survey situations, mail and Internet only. Be careful about mixing visual with aural communication.

2. Adjust procedures for a particular mode (e.g., don’t use show cards for face-to-face interviews in order to get telephone equivalence).

3. For visual modes (web and mail), maintain the same language composition (words, symbols, numbers, and graphics).

slide51
Additional Suggestions for Minimizing Mode Differences in Mixed-Mode Surveys

4. Some differences probably cannot be eliminated, and adjustment of data may be the only solution.

e.g., Social desirability

Acquiescence

5. Some issues might be handled by randomly alternating the order of response categories and/or questions.

e.g., primacy/recency
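Randomly alternating category order can be sketched as giving each respondent either the forward or the reversed scale. The labels below reuse the NATO example from an earlier slide, and the 50/50 assignment is an assumption for illustration:

```python
import random

CATEGORIES = ["Strongly favor", "Somewhat favor",
              "Somewhat oppose", "Strongly oppose"]

def presentation_order(rng=random):
    """Return the scale in forward or reversed order, at random,
    for one respondent, so order effects average out across the sample."""
    return list(CATEGORIES) if rng.random() < 0.5 else CATEGORIES[::-1]

order = presentation_order()
print(order[0])  # either "Strongly favor" or "Strongly oppose"
```

Reversing (rather than fully shuffling) preserves the ordinal structure of the scale while still counterbalancing primacy and recency.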

slide52
The Future

1. We are just beginning to do multiple or mixed-mode surveys on a large scale.

2. Mixing of modes is important for improving response and lowering costs for some studies.

3. Significant progress is being made in understanding how visual layout influences respondents.

slide53
The Future
  4. Significant interaction effects among causes of mode differences are likely, but have not yet been carefully evaluated.
    • Cultural influences (social desirability and acquiescence).
    • Cognitive processing in interviewer-controlled vs. respondent-controlled situations.
    • Visual vs. aural effects.
    • Effects of different visual languages.

5. Mixing modes in surveys requires careful planning and testing, but it can and needs to be done.
