Ecological validity - PowerPoint PPT Presentation (uploaded by donny)

Presentation Transcript
Ecological validity
  • Strategy: forming a research approach
  • The validity of research findings:
  • Internal Validity
  • External Validity
  • “Ecological” Validity

Experimental Design & sampling

Basic experimental design, control, and context

Group Assignment | Baseline (pre-test) | Experimental condition | Follow-up (post-test)
Group 1          | Observe1            | Treatment 1            | Observe2
Group 2          | Observe1            | Treatment 2            | Observe2
Group 3          | Observe1            | Control                | Observe2
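The design in this table is easy to simulate. A minimal Python sketch, assuming invented effect sizes and a made-up score scale (nothing here comes from the lecture itself):

```python
import random

def run_design(n=300, seed=42):
    """Simulate the pretest / treatment / post-test design above.

    Participants are randomly assigned to three groups; "Control" gets
    no treatment. Effect sizes and the 50-point scale are illustrative.
    """
    rng = random.Random(seed)
    effects = {"Treatment 1": 2.0, "Treatment 2": 1.0, "Control": 0.0}
    results = {g: {"pre": [], "post": []} for g in effects}
    for _ in range(n):
        group = rng.choice(list(effects))              # random assignment
        pre = rng.gauss(50, 10)                        # Observe1 (baseline)
        post = pre + effects[group] + rng.gauss(0, 2)  # Observe2 (follow-up)
        results[group]["pre"].append(pre)
        results[group]["post"].append(post)
    # Mean pre-to-post change per group
    return {g: sum(r["post"]) / len(r["post"]) - sum(r["pre"]) / len(r["pre"])
            for g, r in results.items()}

changes = run_design()
```

Because assignment is random, the Control group's pre-to-post change estimates everything that is *not* the treatment (maturation, history, test effects), and the treatment groups' changes can be compared against it.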

The Basics of Research Design

  • Strategy: forming a research approach
  • The validity of research findings:
  • Internal Validity
  • External Validity
  • “Ecological” Validity

The Basics of Research Design

  • Strategy: forming a research approach
  • Basic research designs
  • The validity of research findings

The Research flow
  • What is being studied?
  • What is the contrast space?
    • What is compared to what?
    • What needs explaining / what is “given”

Why?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

What is the contrast space?

What are you really asking?

  • College students versus non-students their age?
  • College students versus high school students?
  • College students versus adults in the community?

Research flow: Contrast space
  • What is being studied?
  • What is the contrast space?
    • What is compared to what?
    • What needs explaining / what is “given”

Why?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

What is the contrast space?

What are you really asking?

  • Risky sex versus safe sex?
  • Risky sex versus no sex?
  • Risky sex versus some other risk, such as driving…

Research flow: Contrast space, 2
  • What is being studied?
  • What is the contrast space?
    • What is compared to what?
    • What needs explaining / what is “given”

Why?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

What is the contrast space?

What are you really asking?

  • Drinking alcohol versus not drinking?

The choice of contrast space defines your Independent Variable

  • Alcohol v. drugs
  • A lot of alcohol v. a low / moderate amount…

Research flow: Theory
  • What is known about the core hypothetical constructs?
    • How do you propose they relate to each other?
  • How will the study expand or clarify theory?
    • Use existing theory to explain a new phenomenon (“Divergent” use of theory)?
    • Test contrasting theories of one phenomenon (“Convergent” use of theory)?
    • New / expanded theory?

Research flow: Theory

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

  • What is known about the core hypothetical constructs?
    • How do you propose they relate to each other?

College environment

Relaxed norms, impulsiveness

Sexual risk

Disinhibiting effect of alcohol

Research flow: Theory
  • What is known about the core hypothetical constructs?
    • How do you propose they relate to each other?

This variable is common to everyone – it is a Constant.

We will not test it, but it is a core part of our theory

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

College environment

Relaxed norms, impulsiveness

Sexual risk

Disinhibiting effect of alcohol

Research flow: Theory
  • What is known about the core hypothetical constructs?
    • How do you propose they relate to each other?

This is the variable we are interested in.

Is there really a biological disinhibiting effect, or do people just expect there to be?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

College environment

Relaxed norms, impulsiveness

Sexual risk

Disinhibiting effect of alcohol

This is our Independent Variable.

Research flow: Theory
  • This is a mediating variable
  • We hypothesize that…
  • environment + alcohol → relaxed norms & impulsivity
  • which then leads to risk.
  • What is known about the core hypothetical constructs?
    • How do you propose they relate to each other?
  • How will the study expand or clarify theory?
    • Use existing theory to explain a new phenomenon (“Divergent” use of theory)?
    • Test contrasting theories of one phenomenon (“Convergent” use of theory)?
    • New / expanded theory?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

College environment

Relaxed norms, impulsiveness

Sexual risk

Disinhibiting effect of alcohol

Research flow: Theory
  • What is known about the core hypothetical constructs?
    • How do you propose they relate to each other?
  • How will the study expand or clarify theory?
    • Use existing theory to explain a new phenomenon (“Divergent” use of theory)?
    • Test contrasting theories of one phenomenon (“Convergent” use of theory)?
    • New / expanded theory?

Of course this is our outcome or Dependent Variable

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

College environment

Relaxed norms, impulsiveness

Sexual risk

Disinhibiting effect of alcohol

Research flow
  • What variables best represent the hypothetical constructs?
  • What is your prediction about how they are related?
    • Measurement design?
    • Quasi-experiment?
    • True experiment?
  • How have the variables been operationally defined?
    • Alternative operational definitions?
    • Implications of this operational definition?
  • Is the predictor best measured or manipulated?
    • Virtues / limitations of each approach?
  • Sampling?

Research flow
  • What variables best represent the hypothetical constructs?
  • What is your prediction about how they are related?
    • Measurement design?
    • Quasi-experiment?
    • True experiment?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

  • How have the variables been operationally defined?
    • Alternative operational definitions?
    • Implications of this operational definition?
  • Is the predictor best measured or manipulated?
    • Virtues / limitations of each approach?
  • Sampling?
  • Hypotheses:
    • Real alcohol leads to more risk than placebo drinks where we just expect alcohol (the Independent variable)
    • Alcohol will lead to risk by inducing relaxation & impulsivity (the Mediating variable)

Research flow
  • What variables best represent the hypothetical constructs?
  • What is your prediction about how they are related?
    • Measurement design?
    • Quasi-experiment?
    • True experiment?

E X A M P L E

Why do college students have riskier sex after drinking alcohol?

  • How have the variables been operationally defined?
    • Alternative operational definitions?
    • Implications of this operational definition?
  • Is the predictor best measured or manipulated?
    • Virtues / limitations of each approach?
  • Sampling?
  • Operational definitions:
    • Simple placebo design: real v. taste of alcohol (Experimental manipulation)
    • Relaxation & impulsivity questionnaires (Measured variable)

Basic Designs: Methods cont.

Who are your participants?

  • What is your sampling method:
  • Where do you recruit participants?
  • Is your study externally valid?

How will you form your control group?

  • Can you practically & ethically have one?
  • Are you using existing groups?
  • Can participants self-select into a group?
  • Is random assignment or matching feasible?

Probability or Non-Probability sample?

What is your sampling frame?

Does your sample represent the population?

  • How will you present the independent variable?
    • Simple presence v. absence?
    • Different doses?

Overview

What do we want to know? Why?

Contrast space: what is compared to what?

Hypothetical constructs?

How are they related?

Specific predictions?

Operational definitions?

Internal validity
  • Strategy: forming a research approach
  • The validity of research findings:
  • Internal Validity
  • External Validity
  • “Ecological” Validity

Basics of Design: Internal Validity

Internal Validity:

Can we validly determine what is causing the results of the experiment?

General Research Hypothesis: the experimental outcome (values of the Dependent Variable) is caused only by the experiment itself (Independent Variable).

Confound: a “3rd variable” (unmeasured variable other than the Independent Variable) actually led to the results.

Core Design Issues:

  • Appropriate control group
  • Equivalent experimental & control groups (except for the Independent Variable).

Key threats to internal validity

1. Lack of a control group

One group: Observe1 → Treatment → Observe2

Maturation: Participants may be older / wiser by the post-test

History: cultural or historical events may occur between pre- and post-test that change the participants

Mortality:Participants may non-randomly drop out of the study

Regression to baseline: Participants who are more extreme at baseline look less extreme over time as a statistical confound.

Reactive Measurement:Participants may change their scores due to being measured twice, not the experimental manipulation.

Within a single-group design, these processes lessen Internal Validity

2-group design: if groups differ in one of these there is a core confound.
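Regression to baseline can be demonstrated with a short simulation; the population mean, noise level, and extremity cutoff below are invented for illustration:

```python
import random

def regression_to_mean(n=10_000, seed=0):
    """Show regression to baseline with NO treatment at all.

    Each observed score = stable true score + random measurement noise.
    Select the most extreme scorers at pre-test and re-measure them:
    their mean drifts back toward the population mean purely by chance.
    """
    rng = random.Random(seed)
    true_scores = [rng.gauss(100, 10) for _ in range(n)]
    pre = [t + rng.gauss(0, 10) for t in true_scores]    # Observe1
    post = [t + rng.gauss(0, 10) for t in true_scores]   # Observe2, no treatment
    cutoff = sorted(pre)[int(0.95 * n)]                  # top 5% at baseline
    extreme = [i for i in range(n) if pre[i] >= cutoff]
    mean_pre = sum(pre[i] for i in extreme) / len(extreme)
    mean_post = sum(post[i] for i in extreme) / len(extreme)
    return mean_pre, mean_post

mean_pre, mean_post = regression_to_mean()
```

Even with no intervention, the extreme group's follow-up mean falls back toward the population mean, because part of their extremity at baseline was noise; a one-group design would misread this as a treatment effect.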

Key threats to Internal Validity, 2

2. Non-equivalent groups

  • Participant selection
  • Experimental procedures

Confound:

Personal characteristics differ across groups

Expectations differ across groups (non-blind)

Self-selection out differs across groups

  • Participants are in an alcohol study, so:
  • We have them self-select which group to be in
  • We let them know which group they are in
  • We encourage members of the alcohol group to drop out if they are uncomfortable

E X A M P L E

Key threats to Internal Validity, 2

2. Non-equivalent groups

  • Participant selection
  • Experimental procedures

Confound:

Groups are not blind

The experimenter is not blind

Procedures differ across groups

  • Participants can just figure out whether they got alcohol or not…
  • One person fixes the drinks and runs the study
  • Since some participants are drinking, we treat them more carefully than the control group

E X A M P L E

Threats to Internal Validity, details
  • Non-equivalent groups; Group Assignment
  • Self-selection:
    • Self-selection in
    • Self-selection out
  • Existing groups:
    • Convenience samples
    • Groups that express the phenomenon
  • Cures:
    • Random assignment to experimental v. control groups.
    • Assess and Match participants on potential confounding variables (demographics, Ψ variables).

People have different motives for joining / dropping out of experimental vs. control groups

  • rare, substantial confound if present.
  • Common in behavioral studies, e.g., more drop out from the experimental group in health interventions.

may differ in subtle ψ variables

  • e.g., 9am class v. 11am class, NYC v. Chicago
  • those who seek therapy v. not, more / less extreme scores at baseline
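One way to picture the matching cure: sort the sample on a measured confound, pair adjacent participants, and randomize within each pair. A hypothetical sketch (the confound, variable names, and data are all illustrative):

```python
import random

def matched_assignment(participants, seed=1):
    """Match participants on a potential confound (age, here), then
    randomly assign within each matched pair."""
    rng = random.Random(seed)
    ordered = sorted(participants, key=lambda p: p["age"])
    experimental, control = [], []
    for i in range(0, len(ordered) - 1, 2):
        pair = [ordered[i], ordered[i + 1]]  # adjacent ages = matched pair
        rng.shuffle(pair)                    # random assignment within pair
        experimental.append(pair[0])
        control.append(pair[1])
    return experimental, control

rng0 = random.Random(0)
people = [{"id": i, "age": rng0.randint(18, 65)} for i in range(40)]
experimental, control = matched_assignment(people)
```

The resulting groups are balanced on age by construction, while assignment within each pair stays random, preserving the logic of the experiment.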

Internal validity; procedures
  • Non-equivalent groups; Procedures

Keep all conditions equal (“constant”) across experimental & control groups except the IV.

  • Participants blind
    • Equalize (“control”) expectations & motivations across groups
    • Control drop-out, loss to follow-up
  • Experimenter blind
    • Control explicit bias
    • Control self-fulfilling expectations
  • Standardize / automate the experimental process
    • Procedures exactly equal across groups
    • Procedures equal across pre- and post- test

Double blind

Summary: Internal validity

Internal Validity overview:

  • Are results due to something other than the Independent Variable?
  • Confounds within the experiment
    • Procedural differences across groups
    • Biased assignment to group.
  • Confounds from outside the experiment
    • History, maturation, cultural change etc.
      • within single-group study
      • differences across groups in multi-group study

External validity
  • Strategy: forming a research approach
  • The validity of research findings:
  • Internal Validity
  • External Validity
  • “Ecological” Validity

Generalizability

External Validity: Can we validly generalize from this experiment to the larger world?

How well can we generalize to:
  • The larger population
  • Other settings
  • Other conditions (independent variables)
  • Other outcomes (dependent variables)

External validity: larger population

The larger population

How well does your research sample represent the larger population?

  • “Volunteerism” bias: people who volunteer for research may not be typical of the general population.
    • attitudes, motivations
    • responses to financial incentives
  • Convenience sampling: easily available, rather than systematically sampled, participants
    • College class, shopping mall, bar, street corner…
    • Personal social network
  • Bias by self-selection.

Cure:
  • Random selection maximizes external validity by best representing the population.

We will spend several lectures on sampling later on.

Clickers!

Random selection is…

The way Chicago voters elect aldermen

The way people are selected for the experimental v. control groups

The way participants are assigned to experimenters

The way people are selected from the population to be in a research sample.

Clickers, 2

Random assignment is…

The way blind dates seem to work

The way people are selected for the experimental v. control groups

The way participants are assigned to experimenters

The way people are selected from the population to be in a research sample.

Random selection v. assignment
  • Key distinction:
  • Random selection: from a larger population to the research sample.
  • Random assignment: from the sample to experimental v. control groups.
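The distinction fits in a few lines of Python (names are illustrative):

```python
import random

rng = random.Random(7)

# A hypothetical population
population = [f"person_{i}" for i in range(10_000)]

# Random SELECTION: population -> research sample (supports external validity)
sample = rng.sample(population, 100)

# Random ASSIGNMENT: sample -> experimental v. control groups
# (supports internal validity)
shuffled = list(sample)
rng.shuffle(shuffled)
experimental, control = shuffled[:50], shuffled[50:]
```

Selection determines whom the results describe; assignment determines whether group differences can be attributed to the treatment.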

External validity: settings

How representative (or realistic) is the social and cultural context of the research?

  • Context: is the setting similar to “real life” settings, or are the results specific to this laboratory, questionnaire, etc.?
  • Procedures: are results an “artifact” of a particular procedure, experimenter, or place or setting?

Cures:
  • Replicate the study by different researchers, in different setting(s), with different samples.
  • Converging studies that test the same hypotheses with substantially different methods
    • Field v. lab studies
    • Experimental v. non- (or quasi-) experimental methods.
    • Qualitative v. quantitative approaches

External validity: settings

Other conditions (independent variables)

How representative is the Independent Variable / experimental manipulation?

  • “Modeling” the phenomenon:
    • does the experimental condition or manipulation create the state you want it to…?
    • Did we really induce: stress, mood, motivation…
  • Dose of the IV: e.g.,
    • drug dose
    • Psychotherapy intensity

Cures:
  • Manipulation check
  • Dose-response studies

External validity: settings

Other outcomes (dependent variables)

How representative is the Dependent Variable / Outcome measure?

  • Operationalization:
    • does the assessment of the DV reflect how the process works outside of the lab?
  • Construct validity:
    • Are you modeling the hypothetical construct you intended?
    • How well have you captured a specific Ψ process?

Cures:

  • Standardized measures e.g., depression, stress…
  • Psychometric studies: is the measure valid and reliable?

External validity: summary

External Validity: Can we validly generalize from this experiment to the larger world?

  • The research Sample: Is the sample typical of the larger population?
  • The research Setting (the study structure & context): Is this typical of “real world” settings where the phenomenon occurs?
  • The Independent Variable: Does the experimental manipulation (or measured predictor) actually create (validly assess…) the phenomenon you are interested in?
  • The Dependent Variable: Is the outcome measure representative, valid & reliable?

Generalizability example

E X A M P L E

Hypothesis: Evaluation anxiety disrupts learning

  • Core design elements (external validity areas):
    • Sample: UIC students
    • Setting: Classroom situation
    • Independent Variable: Anxiety manipulation: class task described as I.Q. test v. non-evaluative exercise
    • Dependent Variable: Operational def. of “learning”: Abstract memory task

Generalizability: population and context

E X A M P L E

UIC students tested in class: how well do the results generalize to…

  • Larger population(s): Other young people → Other Americans → People in general
  • Across contexts: Other University settings → Other structured settings → “Real world” situations such as at work

Generalizability: Independent & dependent variables

E X A M P L E

“I.Q.” instructions & abstract memory task

External validity: Do the results generalize to…

  • Other forms of anxiety (the IV): Other instructions → “Natural” anxiety → Other forms of stress
  • Other outcomes (DV): Other cognitive tasks → Less structured learning tasks → Job or other performance…

Generalizability

E X A M P L E

How well do these data generalize to…

  • Sample: UIC Students → the larger population?
  • Setting: Classroom → other social or learning settings (the study structure & context)?
  • Ind. Var.: “IQ test” instruction → other “anxiety” conditions?
  • Dep. Var.: Abstract memory task → other cognitive skills or tasks?

Generalizability: general research results

Each element of external validity helps determine how meaningful research results are.

External Validity: How well can we generalize to:
  • The larger population
  • Other settings
  • Other conditions (independent variables)
  • Other outcomes (dependent variables)

External validity
  • Strategy: forming a research approach
  • The validity of research findings:
  • Internal Validity
  • External Validity
  • “Ecological” Validity

Context effects

Ecological Validity: How valid is the larger context or “ecology” of the research?
  • The research setting & culture
  • The researcher
  • The participant

Context effects

Ecological Validity: Potential conflicts between the researcher and the participant.

  • Biosocial [age, race, gender, status...]
    • Inherent social conflicts?
  • Psychosocial [attitudes, warmth, skills...]
    • Cooperation?
    • Communication to participants?
  • Situational [e.g., physician, teacher as researcher]
    • Prior relationship or 'dual role' situation
    • Mutual comfort?

Ecological Validity, 2: Researcher Effects

Possible Biases from The Researcher

  • Self-fulfilling expectations (verbal or non-verbal).
    • Rosenthal experiment: “smart” v. “dumb” rats & maze learning.
    • Education research: powerful effects of teacher expectations on student performance.
  • Biased procedures or handling of participants.
    • Clinical research; differential handling of “cases”.
    • Mental health research: more extreme diagnosis for minorities / lower SES pts.
  • Biased data recording: quantitative and qualitative
    • Non-random errors in data coding or entry
    • Confirmatory biases in recall.


Researcher effects; cures

Possible Biases from The Researcher

Cures:
  • Randomize experimenters across condition
    • match or stratify experimenter x participant
    • 'unknown' experimenters
  • Blinding of experimenter(s)
    • double blind
    • Not informing 'hands on' researcher (when blinding impossible)
  • Aggressive standardization & automation

Participant effects

Ecological Validity; Participant biases

  • Motivation to be a 'good' (or bad) subject
  • Social desirability responding
    • Primarily for personal information
    • Cultural & personal differences in what is considered “personal”
    • Face to face v. computer assessment
    • Changes in response over time (Doll et al.; risk disclosure).
  • Infer hypothesis or enrollment criteria (correctly or incorrectly)
    • HIV vaccine research:
      • Risky men lied to get into “low risk” vaccine cohorts, then showed HIV infections. Did the vaccine itself “cause” infections?
      • Reactive risk behavior: concentrated in men who believed they received the vaccine.


Participant effects, 3

Ecological Validity; Participant biases

Cures:
  • Blinding participants
  • Constancy of procedures:
    • automation or structured protocol
    • training researchers
  • Deception or concealment of hypothesis
  • Diverse sampling of participants
  • Computer assessment

Ecological Validity: Context and people

Ecological Validity; The Research setting

  • Social context powerfully affects “individual” behavior:
    • Zimbardo prison experiment, Rosenthal & psychiatric settings
    • Medical context and health measures; e.g., "white coat" effect
    • Self-awareness → "norm following"
  • Context and informational availability:
    • Minimalist social psychology experiments & social judgments.
    • Survey / interview measures and uni-dimensional responding
  • Political / economic demands and simple bias or fraud
    • Drug Co. research and “cherry picking” positive results
    • Political pressure for “No Child Left Behind” & Houston Miracle

Ecological validity: The research setting

Ecological Validity; The Research setting

Reactive measurement

  • Taking the same measure twice may change responses
  • Awareness of the hypothesis → response biases
  • Book example: responses to erotic stimulus (among “helpful” participants).
    • due to the stimulus itself?
    • self-generated imagery to conform to the experimental demand?
  • simulated response?
  • Research measures can create attitude change
    • Survey questions & “normalizing”…

When do you feel it is O.K. to cheat on an exam?
  • .. when I really do not know the material
  • .. when others are doing it
  • .. when I think the exam is unfair

  • Political “push” polls
  • How much do you think Obama is destroying America?
  • How much will Obamacare bankrupt America, economically and morally?

Setting effects: Cures

Ecological Validity; The Research setting

  • Cures:
    • Clear description of research context to aid interpretation
    • Replication of research in other settings / labs / researchers
    • Converging studies that test the same hypothesis with different methods / contexts, sources of participants, & measures.

Overview

Ecological Validity

  • Inherent conflicts:
    • Biosocial / Ψ
    • Situational: dual role
  • Researcher effects:
    • Expectancies
    • Biased procedures
  • Participant effects:
    • Social desirability responding
    • Infer hypothesis → change behavior
  • The setting:
    • Social context → powerful behavioral effects
    • Reactive measurement

Clickers, 2

What does it mean to model a phenomenon?

Wear it strutting down a runway

Use it in a theory

Run a correlational study with it

Create experimental conditions that validly express it

Clickers, 2

What does it mean to replicate a study?

Repeat it exactly

Study it closely

Conduct a similar but different study

Create a monstrous copy

Clickers, 2

Double blind means?

Blind in one eye, can’t see out the other

Both the experimenter and the Principal Investigator don't know what group you are in

An experiment that uses deception

The researcher and the participant are not told the group assignments

Design & validity overview
  • Overall research questions
  • Internal validity (“confounds”)
    • Group assignments
    • Procedures
  • External validity
    • Sample → population?
    • Context: Research lab → “real” settings & contexts?
    • Conditions: Independent variable → “real” conditions?
    • Outcomes: Dependent variable → adequate “model” of phenomenon?
  • Ecological validity (context & conditions):
    • Researcher effects
    • Participant effects
    • Setting effects
  • Operational Definitions, hypothetical constructs, confounds, etc.

Overview, 2
  • Some key terms:
    • Internal validity
    • External validity
    • Replicate
    • “Converging” study
    • Blind
    • Double blind
    • Social desirability responding
    • Reactive measures
