Survey Research

Objectives:

Explain why we use composite measures

Differentiate scale from index

Define criteria for selecting items

Describe strategies for dealing with missing data

Identify which scales are best used where

Illustrate how surveys might be used

Outline how decisions are made about types of question structures

Explain why social desirability is a problem

List guidelines for good questionnaire formats

Demonstrate how to write types of questions and question structures

Defend why order is important in asking questions

Describe the pros and cons of methods of questionnaire distribution

Assess the strengths and weaknesses of survey design

General Criteria Used:
  • Significance of the Topic (sometimes subjective, but the author must convince you)
  • Content of the Research
  • Quality of the Presentation (writing is clear, accurate, and concise; tables and figures are used appropriately; grammar is correct; APA style is followed)
Content
  • Abstract reflects the rest of paper
  • Clear purpose
  • Theoretical or conceptual framework
  • Convincing literature review
  • Methods appropriate and well explained
  • Results reflect purpose
  • Analysis explained and appropriate
  • Discussion ties back to framework and findings
  • Limitations stated
  • Future research/management implications described
Assignment for March 11
  • Article to Review (emailed already and on my website)

http://www4.ncsu.edu/~kahender

  • 2-4 pages addressing:
    • Significance
    • Content (major part)
    • Organization
    • Do you think this paper should be published? BOTTOM LINE
Composite Measures: Scales and Indexes (or is it Indices?)
  • No single indicator is sufficient
  • Capture a wider range of variation
  • Efficient for data analysis
Index and Scale

Similarities:

  • Both are measures of variables.
  • Both order units of analysis in terms of specific variables.
  • Both are measurements based on more than one data item.
Index and Scale

Scoring Differences:

  • Index: accumulate scores assigned to individual attributes.
  • Scale: assign scores to patterns of responses (usually better; see the sketch below).
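A minimal Python sketch of this scoring difference, using made-up yes/no items; the item names, their ordering, and the Guttman-style pattern rule are illustrative assumptions, not prescribed by the slides:

```python
# Hypothetical responses to three yes/no items (1 = yes, 0 = no),
# ordered from "easiest" to "hardest" to endorse.
responses = {"item_easy": 1, "item_medium": 1, "item_hard": 0}

# Index: accumulate scores assigned to individual attributes.
index_score = sum(responses.values())  # 2

# Scale (Guttman-style): score the *pattern* of responses.
# Count items endorsed before the first "no" in the difficulty ordering.
ordered_items = ["item_easy", "item_medium", "item_hard"]
scale_score = 0
for item in ordered_items:
    if responses[item] == 1:
        scale_score += 1
    else:
        break  # the pattern breaks here; later endorsements would be "errors"

print(index_score, scale_score)  # 2 2 for this consistent pattern
```

For a respondent whose answers fit the expected pattern the two scores agree; the scale score becomes more informative when responses deviate from the pattern.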
Constructing an Index or Scale (or any type of paper-and-pencil instrument)
  • Select items
  • Consider empirical relationships: how items are related or grouped
  • Develop response content
  • Determine order and organization
  • Think about what missing data may mean (see the sketch after this list)
  • Pilot test to determine VALIDITY, RELIABILITY, and USABILITY
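A hedged sketch of one way to handle missing items when scoring a composite; the missing-item threshold and the person-mean substitution rule are illustrative choices, not the only defensible strategy:

```python
def score_respondent(items, max_missing=1):
    """Return a composite score, or None if too many items are missing.

    items: list of numeric responses, with None marking a skipped item.
    Missing items (up to max_missing) are replaced with the respondent's
    mean on the items they did answer -- one common, debatable strategy.
    """
    answered = [x for x in items if x is not None]
    n_missing = len(items) - len(answered)
    if n_missing > max_missing or not answered:
        return None  # treat as unusable rather than guess
    person_mean = sum(answered) / len(answered)
    return sum(x if x is not None else person_mean for x in items)

print(score_respondent([4, 3, None, 5]))     # 16.0 (None replaced by 4.0)
print(score_respondent([4, None, None, 5]))  # None: too much missing
```

Dropping heavily incomplete cases avoids inventing data, while light substitution keeps respondents who skipped only one item; either choice should be reported.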
Index

See Babbie for Details

Techniques of Scale Construction (most useful!)
  • Likert scaling: uses standardized response categories.
  • Semantic differential: asks respondents to rate answers between two extremes (see the sketch below).
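A short illustrative sketch of how Likert and semantic differential responses might be scored; the item names, the reverse-coded item, and the 5-point coding are assumptions for the example:

```python
# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree).
# "camp_was_boring" is negatively worded, so it must be reverse-coded
# before the items are combined into one attitude score.
likert = {"enjoyed_camp": 5, "would_return": 4, "camp_was_boring": 2}
reverse_coded = {"camp_was_boring"}

def recode(item, value, points=5):
    return (points + 1 - value) if item in reverse_coded else value

attitude_score = sum(recode(k, v) for k, v in likert.items())  # 5 + 4 + 4 = 13

# Semantic differential: the waterfront rated on 1-5 scales between polar
# adjectives (1 = Good/Fun/Well-planned, 5 = Bad/Boring/Disorganized).
# Lower totals therefore mean a more positive evaluation.
semantic = {"good_bad": 2, "fun_boring": 1, "planned_disorganized": 2}
waterfront_score = sum(semantic.values())  # 5

print(attitude_score, waterfront_score)
```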
Typologies (a little like Themes or Constructs)
  • Typically nominal
  • Summarize the intersection of two or more variables to create a set of categories (see the sketch below).
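A small sketch of building a typology by crossing two variables; the variables, categories, and type labels are hypothetical:

```python
# Crossing two nominal variables to create a four-category typology
# of participants. Names and categories are made up for illustration.
def participation_type(frequency, companionship):
    """Map (frequency, companionship) onto a typology category."""
    if frequency == "often" and companionship == "with others":
        return "social regular"
    if frequency == "often" and companionship == "alone":
        return "solitary regular"
    if frequency == "rarely" and companionship == "with others":
        return "social occasional"
    return "solitary occasional"

respondents = [("often", "alone"), ("rarely", "with others")]
print([participation_type(f, c) for f, c in respondents])
# ['solitary regular', 'social occasional']
```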
To be clear…
  • Respondent: a person who provides data for analysis by responding to a survey questionnaire.
  • Questionnaire: an instrument designed to elicit information that will be useful for analysis.
  • Response rate: the number of people participating in a survey divided by the number selected in the sample.
  • Interview: a data-collection encounter in which one person (an interviewer) asks questions of another (a respondent).
To Use an Existing Instrument or Develop Your Own
    • Does an instrument exist?
    • Is it reliable and valid? And useable??? (Time and cost)
    • Is it appropriate for the participants using it?
    • Are directions clear and concise? Easy to score?
    • OR, Develop your own
    • It must be reliable, valid, useable, appropriate, clear, easy to score
Locating an Instrument
  • Mental Measurement Yearbook
  • D’ literature
  • Standardized Instruments
Survey Techniques

  • Self-administered
    • Mailed
    • Drop off/pick up
    • Group administered
    • Call ahead and mail
  • Internet
    • Survey Monkey
    • Web-based
    • Email
  • Interviews
    • Telephone
    • Personal
    • Group (focus group)
Pros and Cons of Types of Surveys

  • Compare mail, Internet, telephone, personal-interview, and group-interview surveys on:
    • Cost per interview (low, medium, high)
    • Administrative cost
    • Data-collection speed
    • Amount of data
    • Complexity of data
    • Detail of data (strong, medium, weak)
    • Sample selection (fair, good, excellent)
    • Probable response rate
    • Control for researcher effects
Family Feud: Self-Administered

  • Advantages
    • Ease of presenting questions
    • Longer, staged questions can be used
    • High degree of anonymity
  • Disadvantages
    • Requires careful wording
    • No chance to clarify misunderstandings
    • Reading skills required
    • Not good for open-ended questions
Mailed-Advantages
  • Low cost
  • Minimal staff/facilities required
  • Widely dispersed samples
  • Time for respondent to respond
  • Easy to administer
  • Anonymous and confidential
  • Weather not a factor
  • Good for vested interest groups
Mailed Disadvantages
  • Low response rate possible
  • Is the right person filling it out?
  • No personal contact
  • Non-response bias
  • Can't pursue deep answers
Drop Off/Pick Up Advantages
  • Same as Mailed plus:
  • Can explain person to person
  • High response rate
Drop Off/Pick Up Disadvantages
  • Costly for staff time and travel
  • Access may be a problem
  • Safety of staff
Group Administration Advantages
  • High cooperation rates
  • Low cost
  • Personal contact
Group Administration Disadvantages
  • Logistics of getting people together
Internet Advantages
  • Fast response and data collection
  • Good and easy follow-up
  • Cost savings
  • User convenience
  • Wide geographic coverage
  • No interview bias
  • Flexible use
Internet Disadvantages
  • Population lists not available
  • Restricted to people with access
  • Possible Browser incompatibility problems
  • Respondents might be traceable
  • Possibility of multiple submissions
Evaluating Survey Approaches
  • Which survey approach would you choose in each situation below?
  • You need to gather supporting information from your outdoor program participants for an open hearing in two weeks. You don’t really have any budget for the survey.
  • You need to do an assessment of your sports programs in a changing neighborhood (Latino). You have three months and a small budget.
  • Your youth program has focused on getting kids off the street and into your recreation center. You have a small grant to help defray the costs.
Steps to Develop an Instrument
  • Define the problem
  • Determine the content from the research/evaluation question
  • Identify the respondents
  • Develop items, structure, format
  • Write directions
  • Ensure response
Administering a Survey
  • Pilot-test your instrument
  • Have a good cover letter of introduction
  • Mail in ways that encourage a higher response
  • Do follow-ups
  • Check for non-response bias
Points to Ponder
  • What is the purpose of a pilot test?
  • What is the difference between a pilot test and a pre-test?
  • What is the difference between a pilot test and a field test?
  • How do we counter social desirability responses?
Cover Letters

Department of Recreation and Leisure Studies
CB #3185 Evergreen House
University of North Carolina at Chapel Hill
Chapel Hill, NC 27599-3185

December 1, 2004

Dear (name):

Summer staff salaries are an issue each of us will address shortly. For some time, many camp directors have wondered how their summer camp staff salaries compared to other positions in the same region and in comparable camp settings. Baseline data about summer camp staff salaries may be useful information for the camping movement in general, as well as for specific camps.

Your camp has been randomly selected for participation in this project. We are asking only a small percentage of accredited agency, religiously affiliated, and independent day and resident camps across the United States to complete and return this short questionnaire. Therefore, your participation is extremely important.

You may be assured of confidentiality. The questionnaire has an identification code number for mailing purposes only. This coding is done so that we may check your camp name off the mailing list when your questionnaire is returned.

The results of this survey will be made available to the American Camping Association and will be distributed at the next ACA Conference in Denver. You may receive a copy of the results in mid-February if you send us a self-addressed stamped envelope with your questionnaire.

To get the results tabulated by February, it is necessary to have the data as soon as possible. We are asking that you please return the questionnaire by December 18. Since ACA is doing this study on a small budget, you will contribute to the organization by putting your own stamp on the return addressed envelope enclosed.

We appreciate your assistance. We believe we are undertaking a valuable and useful project. If you have any questions, please call us at 919-962-1222. Thank you again.

Sincerely,
(names)
(titles)
Distributing
  • Response rate (number returned / number distributed); see the sketch after this list

  • Incentives
  • Keeping track-assigning participant codes
  • Follow-ups
  • Checking for non-response bias
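A brief sketch of the response-rate calculation and one rough early-versus-late-returner check for non-response bias; all numbers are made up for illustration:

```python
from statistics import mean

# Response rate = number returned / number distributed.
def response_rate(n_returned, n_distributed):
    return n_returned / n_distributed

print(f"{response_rate(212, 400):.0%}")  # 53%

# One rough check for non-response bias: compare early and late returners
# on a key variable, on the argument that late returners resemble
# non-respondents more closely than early returners do.
early_wave = [3.8, 4.1, 3.9, 4.0]
late_wave = [3.2, 3.5, 3.4]
print(round(mean(early_wave) - mean(late_wave), 2))  # 0.58; a large gap is a warning sign
```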
Phone Interviews
  • When possible, schedule the interview
  • Describe the project and how they were chosen
  • Tell them how long the interview will last
  • Let them know if the interview is being recorded
  • Proceed quickly but don’t rush them
  • Remember to read each question and answer exactly as worded
  • Training of interviewers is CRITICAL!
  • Open-ended questions: the respondent is asked to provide his or her own answer to the question.
  • Closed-ended questions: the respondent is asked to select an answer from among a list provided by the researcher.
  • Bias: any property of questions that encourages respondents to answer in a particular way.
  • Contingency question: a survey question intended only for some respondents, determined by their response to some other question (see the sketch below).
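A small sketch of how a contingency question might be scripted (for example, in a web or phone survey tool); the question wording is hypothetical:

```python
# A contingency question is only asked of respondents whose earlier answer
# makes it relevant; everyone else skips it by design.
def ask(prompt):
    return input(prompt + " ").strip().lower()

answers = {}
answers["attended_camp"] = ask("Did you attend camp this summer? (yes/no)")

if answers["attended_camp"] == "yes":
    # Contingency question: only campers see it.
    answers["favorite_activity"] = ask("What did you like best about camp?")
else:
    answers["favorite_activity"] = None  # skipped by design, not missing data

print(answers)
```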
Kinds of Information Sought
  • Behavior info
  • Knowledge info
  • Attitudes/beliefs/values info
  • Emotions
  • Demographic info

Pay attention to the relationship of the respondent to the question (their past, present, future).

Potential Formats
  • Open-ended
  • Closed-ended
    • Fixed alternatives
    • Likert scale
    • Semantic differential
    • Ranking
  • Partially closed-ended
Example Question Structures
  • Open-ended: “What did you like best about camp?”
  • Closed-ended, fixed alternatives: “What did you like best about camp this summer?”
    • Counselors
    • Evening campfires
    • Food
    • Swimming
    • Camp-outs
Examples (cont'd)
  • Closed-ended, Likert: “What did you think about each of these parts of camp?”

                      Poor   Fair   Good   Great
Counselors              1      2      3      4
Evening Campfires       1      2      3      4
Food                    1      2      3      4
Swimming                1      2      3      4

Examples (cont'd)
  • Closed-ended, semantic differential: “What did you think about the waterfront activities?”

Good 1 2 3 4 5 Bad

Fun 1 2 3 4 5 Boring

Well-Planned 1 2 3 4 5 Disorganized

Examples (cont'd)
  • Closed-ended, ranking: “What did you like best about camp this summer?” Rank the following items 1 through 5, with 1 being the best.
    • ___ Counselors
    • ___ Evening Campfires
    • ___ Food
    • ___ Swimming
    • ___ Overnights
Examples (cont'd)
  • Partially closed-ended: “What did you like best about camp this summer?” (Circle the number of your answer.)
    • 1. Counselors
    • 2. Evening campfires
    • 3. Food
    • 4. Swimming
    • 5. Other __________________________________
Wording Advice
  • One idea per question
  • Clear, brief, and simple
  • Avoid leading questions
  • Avoid estimates if possible
  • Use words familiar to respondent
  • Avoid fancy words and jargon
  • Be clear about meanings of words
  • Avoid negative questions
  • Do a pilot study
  • State alternatives precisely (mutually exclusive responses)
  • Use staged questions if needed, but give GOOD directions
What’s wrong with these questions?
  • Do you favor raising camper fees for trip programs but not for other camp programs? Yes No
  • Camp encourages character development. Did you attend camp as a child? Yes No
  • Currently the United Way supports 10% of the agency camp budget. Do you feel this amount should be:

a. Decreased

b. Stay the Same

c. Increased somewhat

d. Increased greatly

  • Did you use the ACA, CCI, or AEE Camp Directories when selecting the camp for your child? Yes No
What's Wrong (cont'd)
  • What changes should camp make to improve the programs?
  • How many times did you go swimming this summer?
  • How did you first hear about Camp XYZ:
    • Friends b. Relatives c. At a meeting d. TV/radio e. At work
  • How many nights off did you have?
  • Why would you not want the proposed camp fees voted down by the Board?
  • Should the Y put more money into the camp or not?
Format and Layout Design For Good Response
  • Give clear directions
  • Aesthetically pleasing
  • Start with something easy/familiar AND relevant (NOT demographics, though)
  • Have white space & easy font to read
  • Colored paper if easy to read—font size appropriate for audience
  • Have easy to follow directions for staged questions
  • Have “professional look”
  • Front page should include title, date or time or year, perhaps a graphic
  • Keep the length manageable for the info desired
  • Anchor all numbered responses—e.g., 5=strongly agree, 4=agree etc.
  • NO TYPOGRAPHICAL ERRORS
Question Order
  • Clearly worded
  • Put questions of lesser importance later
  • Group controversial questions with less controversial ones
  • Usually general to more specific
  • Use contingency or staged questions
  • Group questions by content—can use headings
  • Use varied lettering (bold, italics), but be consistent in USE
  • If using a scale, make sure it is repeated on the next page
  • Identify whether you want one or MORE answers to a question (the BEST one, or all that apply?)
Instrument Draft
  • Draft a 2-page questionnaire (with at least 10 questions) that you might use for something you are doing in research or other professional work
  • Use at least two question structures and two types of question content
  • Pilot test the questionnaire (with at least 2-3 people) and revise it
  • Write a one-page description of your questionnaire's purpose and the changes you made after it was pilot tested (you might include the original as well as your revised questionnaire)