Experimental Design, Statistical Analysis

CSCI 4800/6800

University of Georgia

March 7, 2002

Eileen Kraemer


Research Design

  • Elements:

    • Observations/Measures

    • Treatments/Programs

    • Groups

    • Assignment to Group

    • Time


Observations/Measures

  • Notation: ‘O’

    • Examples:

      • Body weight

      • Time to complete

      • Number of correct responses

  • Multiple measures: O1, O2, …


Treatments or Programs

  • Notation: ‘X’

    • Use of medication

    • Use of visualization

    • Use of audio feedback

    • Etc.

  • Sometimes see X+, X-


Groups

  • Each group is assigned a line in the design notation


Assignment to Group

  • R = random

  • N = non-equivalent groups

  • C = assignment by cutoffs


Time

  • Moves from left to right in diagram


Types of experiments

  • True experiment – random assignment to groups

  • Quasi experiment – no random assignment, but has a control group or multiple measures

  • Non-experiment – no random assignment, no control, no multiple measures


Design Notation Example

Pretest-posttest, treatment vs. comparison group, randomized experiment. In the notation above:

R O X O
R O   O


Design Notation Example

Pretest-posttest, non-equivalent groups, quasi-experiment. In the notation above:

N O X O
N O   O


Design Notation Example

Posttest-only non-experiment. In the notation above:

X O


Goals of Design

  • Goal: to be able to show causality

  • First step: internal validity:

    • If X, then Y

      AND

    • If not X, then not Y


Two-group Designs

  • Two-group, posttest only, randomized experiment

Compare by testing for differences between the means of the groups, using a t-test or a one-way Analysis of Variance (ANOVA)

Note: 2 groups, post-only measure, two distributions each with mean and variance, statistical (non-chance) difference between groups
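As a concrete sketch, the two-group posttest-only comparison can be run with SciPy; the scores below are invented for illustration:

```python
from scipy import stats

# Hypothetical posttest scores for two randomly assigned groups
# (invented data, for illustration only)
treatment = [85, 90, 78, 92, 88, 76, 84, 91]
control = [80, 75, 83, 71, 78, 74, 82, 77]

# Independent-samples t-test on the difference between group means
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

A small p-value indicates that a difference between means this large is unlikely to arise by chance alone.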


To analyze …

  • What do we mean by a difference?


Possible Outcomes:


Measuring Differences …


Three ways to estimate effect

  • Independent t-test

  • One-way Analysis of Variance (ANOVA)

  • Regression Analysis (most general)

  • All three methods are equivalent
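The equivalence of the three methods can be checked directly with SciPy on invented two-group data: the one-way ANOVA F statistic equals the squared t statistic, and a regression on a 0/1 group indicator yields the same p-value:

```python
from scipy import stats

# Invented scores for two groups (illustration only)
group1 = [12, 15, 11, 14, 13, 16]
group2 = [10, 9, 12, 8, 11, 10]

t_stat, p_t = stats.ttest_ind(group1, group2)   # independent t-test
f_stat, p_f = stats.f_oneway(group1, group2)    # one-way ANOVA

# Regression on a dummy variable coding group membership
x = [0] * len(group1) + [1] * len(group2)
reg = stats.linregress(x, group1 + group2)

print(f"t^2 = {t_stat**2:.4f}, F = {f_stat:.4f}")
print(f"p: t-test = {p_t:.6f}, ANOVA = {p_f:.6f}, regression = {reg.pvalue:.6f}")
```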


Computing the t-value


Computing the variance


Regression Analysis

Fit the model yi = β0 + β1xi + ei: solve the overdetermined system of equations for β0 and β1 while minimizing the sum of the squared e-terms
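A minimal sketch of this least-squares fit with NumPy, on invented data where x is a 0/1 group indicator; β0 then comes out as the mean of the x = 0 group and β1 as the difference between the group means:

```python
import numpy as np

# Invented data following y_i = beta0 + beta1 * x_i + e_i,
# where x_i is a 0/1 group indicator (illustration only)
x = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
y = np.array([10.0, 12.0, 11.0, 15.0, 14.0, 16.0])

# Solve the overdetermined system [1 x] @ (beta0, beta1) ~= y in the
# least-squares sense, i.e. minimizing the sum of squared e-terms
A = np.column_stack([np.ones_like(x), x])
(beta0, beta1), *_ = np.linalg.lstsq(A, y, rcond=None)

# beta0 is the mean of the x = 0 group; beta1 is the difference in means
print(f"beta0 = {beta0:.3f}, beta1 = {beta1:.3f}")
```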


Regression Analysis


ANOVA

  • Compares differences within group to differences between groups

  • For 2 populations, 1 treatment, same as t-test

  • Statistic used is F value, same as square of t-value from t-test


Other Experimental Designs

  • Signal enhancers

    • Factorial designs

  • Noise reducers

    • Covariance designs

    • Blocking designs


Factorial Designs


Factorial Design

  • Factor – major independent variable

    • Setting, time_on_task

  • Level – subdivision of a factor

    • Setting = in_class, pull-out

    • Time_on_task = 1 hour, 4 hours


Factorial Design

  • Design notation as shown

  • 2x2 factorial design (2 levels of one factor X 2 levels of second factor)


Outcomes of Factorial Design Experiments

  • Null case

  • Main effect

  • Interaction Effect


The Null Case


The Null Case


Main Effect - Time


Main Effect - Setting


Main Effect - Both


Interaction Effects


Interaction Effects


Statistical Methods for Factorial Design

  • Regression Analysis

  • ANOVA


ANOVA

  • Analysis of variance – tests hypotheses about differences between two or more means

  • Could do pairwise comparisons using t-tests, but this inflates the probability of rejecting a true null hypothesis (Type I error) relative to ANOVA
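To see why multiple pairwise t-tests inflate the Type I error rate, consider k comparisons each run at level α: under the idealizing assumption that the comparisons are independent, the chance of at least one false rejection is 1 − (1 − α)^k. A quick sketch:

```python
# Familywise Type I error for k independent comparisons at level alpha:
#   P(at least one false rejection) = 1 - (1 - alpha)^k
# (independence between comparisons is an idealizing assumption)
alpha = 0.05

for k in (1, 3, 6):  # 3 groups need 3 pairwise tests; 4 groups need 6
    familywise = 1 - (1 - alpha) ** k
    print(f"{k} comparisons: familywise error rate = {familywise:.4f}")
```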


Between-subjects design

  • Example:

    • Effect of intensity of background noise on reading comprehension

    • Group 1: 30 minutes reading, no background noise

    • Group 2: 30 minutes reading, moderate level of noise

    • Group 3: 30 minutes reading, loud background noise


Experimental Design

  • One factor (noise), three levels (a = 3)

  • Null hypothesis: μ1 = μ2 = μ3


Notation

  • If all sample sizes same, use n, and total N = a * n

  • Else N = n1 + n2 + n3


Assumptions

  • Normal distributions

  • Homogeneity of variance

    • Variance is equal in each of the populations

  • Random, independent sampling

  • Still works well when the assumptions are not quite true (“robust” to violations)
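These assumptions can be checked informally before running the ANOVA. A sketch with SciPy on invented samples for the three noise conditions, using the Shapiro-Wilk test for normality and Levene's test for homogeneity of variance:

```python
from scipy import stats

# Invented comprehension scores for the three noise conditions
no_noise = [82, 88, 75, 90, 85, 79, 86, 81]
moderate = [78, 74, 80, 72, 77, 75, 79, 73]
loud = [70, 65, 72, 60, 68, 66, 71, 63]

# Shapiro-Wilk test of normality, per group
for name, sample in [("none", no_noise), ("moderate", moderate), ("loud", loud)]:
    w_stat, p = stats.shapiro(sample)
    print(f"{name}: Shapiro-Wilk p = {p:.3f}")

# Levene's test of homogeneity of variance across groups
lev_stat, lev_p = stats.levene(no_noise, moderate, loud)
print(f"Levene p = {lev_p:.3f}")
```

Large p-values here give no reason to doubt the assumptions; small ones suggest caution.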


ANOVA

  • Compares two estimates of variance

    • MSE – Mean Square Error, variances within samples

    • MSB – Mean Square Between, variance of the sample means

  • If the null hypothesis

    • is true, then MSE ≈ MSB, since both are estimates of the same quantity

    • is false, then MSB is sufficiently greater than MSE


MSE


MSB

  • Use the sample means to estimate the variance of the sampling distribution of the mean; in the example, this estimate = 1


MSB

  • MSB = n × (estimated variance of the sampling distribution of the mean)

  • In the example, MSB = (n)(sampling-distribution variance) = (4)(1) = 4


Is it significant?

  • Depends on ratio of MSB to MSE

  • F = MSB/MSE

  • The probability value is computed from the F value, whose sampling distribution depends on the numerator degrees of freedom (a − 1) and the denominator degrees of freedom (N − a)

  • Look up the F value in a table to find the p value

  • For one degree of freedom, F = t²
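Putting the MSE/MSB pieces together: a one-way ANOVA computed by hand on invented data (a = 3 groups, n = 4 per group), with the p-value taken from the F distribution and a cross-check against SciPy's built-in `f_oneway`:

```python
from statistics import mean, variance
from scipy import stats

# Invented data: a = 3 groups, n = 4 observations each
groups = [[3, 5, 4, 6], [7, 8, 6, 7], [5, 6, 5, 4]]
a, n = len(groups), len(groups[0])
N = a * n

# MSE: average of the within-group sample variances
mse = mean(variance(g) for g in groups)

# MSB: n times the variance of the group means
msb = n * variance([mean(g) for g in groups])

f_value = msb / mse
p_value = stats.f.sf(f_value, a - 1, N - a)  # df: numerator a-1, denominator N-a

# Cross-check against SciPy's one-way ANOVA
f_check, p_check = stats.f_oneway(*groups)
print(f"F = {f_value:.3f}, p = {p_value:.4f}")
```

With equal group sizes, the hand computation and `f_oneway` agree exactly.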


Factorial Between-Subjects ANOVA, Two factors

  • Three significance tests

    • Main factor 1

    • Main factor 2

    • Interaction


Example Experiment

  • Two factors (dosage, task)

  • 3 levels of dosage (0, 100, 200 mg)

  • 2 levels of task (simple, complex)

  • 2x3 factorial design, 8 subjects/group


Summary table

SOURCE    df   Sum of Squares   Mean Square         F       p
Task       1       47125.3333    47125.3333   384.174   0.000
Dosage     2          42.6667       21.3333     0.174   0.841
TD         2        1418.6667      709.3333     5.783   0.006
ERROR     42        5152.0000      122.6667
TOTAL     47       53738.6667

  • Sources of variation:

    • Task

    • Dosage

    • Interaction

    • Error
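The table's Mean Square and F columns follow mechanically from its sums of squares and degrees of freedom; a quick recomputation using the values from the table above:

```python
# Recompute Mean Squares and F ratios from the summary table's
# sums of squares and degrees of freedom
ms_error = 5152.0000 / 42  # Mean Square = SS / df

for name, ss, df in [("Task", 47125.3333, 1),
                     ("Dosage", 42.6667, 2),
                     ("TD", 1418.6667, 2)]:
    ms = ss / df
    f = ms / ms_error
    print(f"{name}: MS = {ms:.4f}, F = {f:.3f}")
```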


Results

  • Sum of squares (as before)

  • Mean Squares = (sum of squares) / degrees of freedom

  • F ratios = mean square effect / mean square error

  • P value: given the F value and degrees of freedom, look up the p value


Results - example

  • Mean time to complete task was higher for complex task than for simple

  • Effect of dosage not significant

  • An interaction exists between dosage and task: increasing the dosage decreases performance on the complex task while increasing performance on the simple task

