Participant Observation - PowerPoint PPT Presentation

Presentation Transcript
Participant Observation
  • A method of doing field research, also called ethnography or participant observation; a form of qualitative research
  • The researcher is socialized into the social setting, i.e., goes where the action is and simply listens, watches, and jots down notes
  • The researcher participates in a role in the field and makes observer comments (the subjective view)
  • Field observations are collected as field notes (the objective view)
Interview Schedule
  • An interview is a piece of social interaction in which one person asks another a number of questions and the other gives answers
  • A qualitative interview is essentially a conversation, e.g., a face-to-face interview, a focus group, or a telephone interview
  • Types: structured (standardized) and semi-structured
  • A structured interview schedule is similar to a paper-and-pencil questionnaire: it can be converted into a questionnaire, and vice versa
Content Analysis
  • Is the study of recorded human communications
  • Examples: newspapers, magazines, web pages, poems, books, songs, paintings, speeches, letters, e-mail messages, laws, constitutions, etc.
  • Any technique for making inferences by systematically and objectively identifying specified characteristics of messages; those characteristics may be manifest or latent
  • Manifest content: the visible, surface content of the communication (the intended meaning); see the coding sketch below
  • Latent content: the underlying, unintended meaning; it requires corroboration
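To make the manifest/latent distinction concrete, here is a minimal sketch of manifest coding in Python. The coding scheme, keywords, and document are invented for illustration; latent coding would still require human judgment about underlying meaning.

    from collections import Counter
    import re

    def manifest_code(text, categories):
        """Count keyword hits per category (manifest, surface-level coding)."""
        words = re.findall(r"[a-z']+", text.lower())
        return Counter({category: sum(words.count(k) for k in keywords)
                        for category, keywords in categories.items()})

    # Hypothetical coding scheme and field document
    scheme = {"family": ["mother", "father", "family"],
              "work": ["job", "office"]}
    doc = "My mother and father both kept a job; the office defined our family."
    print(manifest_code(doc, scheme))  # family: 3, work: 2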
Summary
  • “Content analysis can be fruitfully employed to examine virtually any type of communication” (Abrahamson, 1983, p. 286).
  • As a consequence, it can focus on either qualitative or quantitative aspects of communication messages
RELIABILITY:
  • Is the degree to which a test consistently measures whatever it measures
  • Kirk and Miller (1986) identify three types:
  • (i) Quixotic: a single method of observation continually yields an unvarying measurement, e.g., every observer has been told to say the same thing; trivial (FBI stories, etc.)
  • (ii) Diachronic: the stability of an observation over time; weakness: nothing is fixed, things change
  • (iii) Synchronic: the similarity of observations within the same time period; the most important type
Solution to the Problem of Reliability:
  • Carefully report the methodology used in gathering the data
  • Use double-coding as a means of checking reliability (Miles and Huberman, 1994):
  • i.e., two or more researchers coding the same field data (inter-coder reliability), or
  • one researcher coding the same segment of data at two different times (intra-coder reliability)
Calculation of Reliability
  • Reliability = number of agreements divided by the total number of agreements plus disagreements, as in the sketch below
  • Most desirable range: 90%
  • Reliability is much easier to assess than validity
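A minimal sketch of the agreement formula applied to double-coding: two hypothetical coders assign categories to the same ten field-note segments, and reliability is the share of agreements. All codes and data are invented for illustration.

    def percent_agreement(coder_a, coder_b):
        """Reliability = agreements / (agreements + disagreements)."""
        if len(coder_a) != len(coder_b):
            raise ValueError("both coders must rate the same segments")
        agreements = sum(a == b for a, b in zip(coder_a, coder_b))
        return agreements / len(coder_a)

    # Codes assigned by two coders to the same ten segments
    a = ["emic", "etic", "emic", "emic", "etic", "emic", "etic", "etic", "emic", "emic"]
    b = ["emic", "etic", "emic", "etic", "etic", "emic", "etic", "etic", "emic", "emic"]
    print(f"{percent_agreement(a, b):.0%}")  # 90%, at the desirable level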
VALIDITY:
  • Is the degree to which a test measures what it is supposed to measure
  • i.e., to confirm how plausible the collected data are
  • Kenneth Pike (1969) coined the concepts emic and etic to explain validity in qualitative research
  • Emic: studying behavior from inside the system, i.e., using local concepts, e.g., family, culture, etc.
  • Etic: studying behavior from outside the system, i.e., using pan-cultural concepts, e.g., circumcision of males
Modifying an Imposed Etic to Achieve a Valid Emic Perspective
  • Generate the emic content of an etic construct, i.e., take the etic construct and interpret its emic content, e.g., polygamy (R. W. Brislin, 1976)
  • The researcher can use triangulation, i.e., multiple methods of data collection:
  • Open-ended techniques and
  • Participant observation
Reliability vs. Validity in Quantitative Research:
  • Similar to qualitative research, because both deal with measurement
RELIABILITY:
  • Means consistency or dependability
  • Example: a weight scale; one steps on it and reads 150 as the weight
  • If one repeats this and gets the same weight each time, then the scale is reliable
  • Reliability also focuses on measurement, or instrumentation,
  • and is addressed in a variety of ways: test-retest, equivalent-forms, and split-half
Test-Retest:
  • Is the degree to which scores are consistent over time
  • Example: the relationship between SAT scores in 2005 and 2006,
  • i.e., administering the SAT to the same group of high school seniors at different times
  • and consistently obtaining the same scores; see the sketch below
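A minimal sketch of a test-retest check: the two administrations are correlated with Pearson's r from the standard library. The scores are hypothetical; a high correlation is the usual way to quantify the consistency the slide describes. The same computation serves for equivalent-forms reliability, with the two lists coming from the two forms.

    from statistics import correlation  # Pearson's r; Python 3.10+

    # Hypothetical SAT scores for the same five seniors in 2005 and 2006
    scores_2005 = [1180, 1350, 990, 1420, 1100]
    scores_2006 = [1200, 1330, 1010, 1400, 1090]

    # A high r means students' scores are stable across administrations
    r = correlation(scores_2005, scores_2006)
    print(f"test-retest reliability r = {r:.2f}")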
Equivalent-Forms
  • Administering two different forms of the same test, e.g., the SAT, to the same group at the same time
  • Most acceptable estimate of reliability
  • Therefore, most commonly used in research
Split-Half
  • Items on the instrument are divided into comparable halves
  • E.g., a scale divided so that the first half yields the same score as the second
  • Looks at internal consistency
  • Weakness: it is difficult to ensure that the two halves are equivalent; see the sketch below
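A minimal sketch of a split-half check: each respondent's items are split into odd- and even-numbered halves and the half-scores are correlated. The Spearman-Brown step-up, not named on the slide but the standard correction for halving test length, adjusts the estimate; the scale and responses are hypothetical.

    from statistics import correlation  # Pearson's r; Python 3.10+

    def split_half_reliability(item_scores):
        """Correlate odd- vs. even-item totals, then apply Spearman-Brown."""
        odd = [sum(row[0::2]) for row in item_scores]   # items 1, 3, 5, ...
        even = [sum(row[1::2]) for row in item_scores]  # items 2, 4, 6, ...
        r = correlation(odd, even)
        return (2 * r) / (1 + r)  # Spearman-Brown step-up

    # Hypothetical 6-item scale answered by five respondents
    data = [[4, 5, 4, 4, 5, 4],
            [2, 1, 2, 2, 1, 1],
            [3, 3, 4, 3, 3, 4],
            [5, 5, 5, 4, 5, 5],
            [1, 2, 1, 2, 2, 1]]
    print(f"split-half reliability = {split_half_reliability(data):.2f}")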
VALIDITY:
  • Measuring what you think you are measuring
Content (Face) validity:
  • Is the degree to which a test measures an intended content area, e.g., achievement tests
  • Example: a measure of knowledge of parenting skills could be validated by consulting experts such as social workers and parents
  • The judgment depends on the knowledge of the experts
Criterion validity:
  • Describes the extent to which a correlation exists between the measuring instrument and another standard; it rests on empirical evidence
  • E.g., the relationship between the college board examination and student academic success in college
  • Two measures need to be taken: the measure of the test itself and the criterion to which the test is related
  • E.g., a program to help pregnant teenagers succeed in high school, with a criterion such as SAT scores as a comparison; see the sketch below
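A minimal sketch of a criterion-validity check, mirroring the college board example: correlate instrument scores with the criterion measure. All data are hypothetical.

    from statistics import correlation  # Pearson's r; Python 3.10+

    # Hypothetical college board scores (instrument) and first-year
    # college GPA (criterion) for the same six students
    exam_scores = [1150, 1300, 980, 1420, 1210, 1050]
    college_gpa = [2.9, 3.4, 2.5, 3.8, 3.1, 2.7]

    validity_r = correlation(exam_scores, college_gpa)
    print(f"criterion validity r = {validity_r:.2f}")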
Construct validity:
  • Is the degree to which a test measures an intended hypothetical construct
  • i.e., a non-observable trait, such as intelligence, which explains behavior
  • Involves testing hypotheses; it is deductive
  • Most difficult to establish
Difference between reliability and validity
  • Reliability: the degree to which a measurement procedure produces similar outcomes when it is repeated.
  • E.g., gender, birthplace, and mother’s name should always be the same
  • Validity: tests for determining whether a measure is measuring the concept that the researcher thinks is being measured,
  • i.e., “Am I measuring what I think I am measuring?”
Note:
  • a valid test is always reliable but a reliable test is not necessarily valid
  • E.g., intending to measure concepts (such as positivism) but instead measuring nouns makes the test invalid
  • Reliability is much easier to assess than validity.