

  1. Treating Stimuli as a Random Factor in Social Psychology: A New and Comprehensive Solution to a Pervasive but Largely Ignored Problem
  Jacob Westfall (University of Colorado Boulder), Charles M. Judd (University of Colorado Boulder), David A. Kenny (University of Connecticut)

  2. What to do about replicability?
  • Mandatory reporting of all DVs, studies, etc.?
  • Journals or journal sections devoted to straight replication attempts?
  • Pre-registration of studies?
  • Many of the proposed solutions involve large-scale institutional changes, restructuring of incentives, etc.
  • These are good ideas worth discussing, but they are neither quick nor easy to implement

  3. One way to increase replicability: Treat stimuli as random
  • Failing to account for the uncertainty associated with stimulus sampling (i.e., treating stimuli as fixed rather than random) leads to biased, overconfident estimates of effects (Clark, 1973; Coleman, 1964)
  • The pervasive failure to model stimuli as a random factor is probably responsible for many failures to replicate when later studies use different stimulus samples

  4. Doing the correct analysis is easy!
  • Recently developed statistical methods solve the problem of stimulus sampling
  • These mixed models with crossed random effects are easy to apply and are already widely available in major statistical packages (R, SAS, SPSS, Stata, etc.)

  5. Outline of the rest of the talk
  • The problem
    • Illustrative design and typical RM-ANOVA analyses
    • Estimated Type 1 error rates
  • The solution
    • Introducing mixed models with crossed random effects for participants and stimuli
    • Applications of mixed model analyses to actual datasets

  6. Illustrative Design
  • Participants crossed with Stimuli: each Participant responds to each Stimulus
  • Stimuli nested under Condition: each Stimulus appears in either Condition A or Condition B, never both
  • Participants crossed with Condition: Participants make responses under both Conditions
  • Sample of hypothetical dataset: [table shown on slide; a sketch of such a dataset follows below]
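To make the design concrete, here is a minimal R sketch, not taken from the slides, of what such a hypothetical dataset might look like; the variable names (participant, stimulus, condition, y) and all values are illustrative assumptions.

    # A minimal sketch (not from the slides) of a participants-by-stimuli crossed
    # dataset with stimuli nested under condition; all names and values are made up.
    set.seed(1)
    n_part <- 4                                   # participants
    n_stim <- 6                                   # stimuli (3 per condition)
    dat <- expand.grid(participant = factor(1:n_part),
                       stimulus    = factor(1:n_stim))
    dat$condition <- ifelse(as.integer(dat$stimulus) <= n_stim / 2, "A", "B")
    dat$y <- round(rnorm(nrow(dat), mean = 5, sd = 2), 2)   # hypothetical ratings
    head(dat)                                     # every participant rates every stimulus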

  7. Typical repeated measures analyses (RM-ANOVA): the "by-participant analysis"
  • How variable are the stimulus ratings around each of the participant means? That variance is lost due to the aggregation over stimuli

  8. Typical repeated measures analyses (RM-ANOVA): the "by-stimulus analysis"
  [Figure: by-stimulus means computed from Sample 1 vs. Sample 2 of stimuli; a sketch of both aggregation analyses is given below]
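For concreteness, the following is a sketch, not the authors' code, of the two standard aggregation analyses, reusing the hypothetical data frame dat built above. With only two conditions the paired t-test on participant means is equivalent to the within-participants RM-ANOVA, and the independent-samples t-test on stimulus means is equivalent to the between-stimuli ANOVA.

    # A sketch (not from the slides) of the two standard RM-ANOVA-style
    # aggregation analyses, reusing the hypothetical data frame dat from above.

    # By-participant analysis: average over stimuli within each participant x condition,
    # then compare conditions within participants (paired t-test).
    by_part <- aggregate(y ~ participant + condition, data = dat, FUN = mean)
    wide    <- reshape(by_part, idvar = "participant", timevar = "condition",
                       direction = "wide")            # columns y.A and y.B
    t.test(wide$y.A, wide$y.B, paired = TRUE)

    # By-stimulus analysis: average over participants within each stimulus,
    # then compare the two sets of stimuli (stimuli are nested under condition).
    by_stim <- aggregate(y ~ stimulus + condition, data = dat, FUN = mean)
    t.test(y ~ condition, data = by_stim, var.equal = TRUE)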

  9. Simulation of Type 1 error rates for typical RM-ANOVA analyses
  • Design is the same as previously discussed
  • Draw random samples of participants and stimuli
  • Variance components = 4, error variance = 16
  • Number of participants ∈ {10, 30, 50, 70, 90}
  • Number of stimuli ∈ {10, 30, 50, 70, 90}
  • Conducted both the by-participant and the by-stimulus analysis on each simulated dataset
  • True Condition effect = 0
  (A condensed sketch of one cell of such a simulation follows below.)
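The following is a condensed sketch of one cell of such a simulation, not the authors' simulation code: data are generated under the null with all variance components equal to 4 and error variance 16, and only the by-participant test is evaluated; the sample sizes and the number of replications used here are arbitrary choices.

    # One cell of the Type 1 error simulation (a sketch): variance components = 4,
    # error variance = 16, true Condition effect = 0.
    set.seed(2)
    sim_once <- function(n_part = 30, n_stim = 30) {
      d <- expand.grid(participant = 1:n_part, stimulus = 1:n_stim)
      d$condition <- ifelse(d$stimulus <= n_stim / 2, -0.5, 0.5)   # effect-coded Condition
      p_int <- rnorm(n_part, 0, 2)                                 # participant intercepts, var = 4
      p_slo <- rnorm(n_part, 0, 2)                                 # participant slopes, var = 4
      s_int <- rnorm(n_stim, 0, 2)                                 # stimulus intercepts, var = 4
      d$y <- p_int[d$participant] + p_slo[d$participant] * d$condition +
             s_int[d$stimulus] + rnorm(nrow(d), 0, 4)              # error variance = 16
      by_part <- aggregate(y ~ participant + condition, data = d, FUN = mean)
      w <- reshape(by_part, idvar = "participant", timevar = "condition",
                   direction = "wide")
      t.test(w[[2]], w[[3]], paired = TRUE)$p.value < .05          # does the by-participant test reject?
    }
    mean(replicate(500, sim_once()))   # simulated Type 1 error rate, by-participant analysis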

  10. Type 1 error rate simulation results
  • The exact simulated error rates depend on the variance components, which, although realistic, were ultimately arbitrary
  • The main points to take away are:
    • The standard analyses will virtually always show some degree of positive bias
    • In some (entirely realistic) cases this bias can be extreme
    • The degree of bias depends in a predictable way on the design of the experiment (e.g., the sample sizes)

  11. The old solution: Quasi-F statistics
  • Although quasi-Fs successfully address the statistical problem, they suffer from a variety of limitations:
    • Require a complete, orthogonal design (balanced factors)
    • No missing data
    • No continuous covariates
    • A different quasi-F must be derived (often laboriously) for each new experimental design
    • Not widely implemented in major statistical packages

  12. The new solution: Mixed models
  • Known variously as mixed-effects models, multilevel models, random-effects models, hierarchical linear models, etc.
  • Most social psychologists are familiar with mixed models for hierarchical random factors (e.g., students nested in classrooms)
  • Less well known is that mixed models can also easily accommodate designs with crossed random factors (e.g., participants crossed with stimuli), as sketched below
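As a sketch of the difference, and not part of the slides: in lme4 the nested and crossed cases differ only in how the random-effects terms are written. The toy fit below reuses the hypothetical dat from earlier and includes random intercepts only (the fuller model, with a by-participant random slope for condition, appears in the syntax on slide 21); with such tiny toy data the fit may well be singular, and the point is only the form of the call.

    # Nested vs. crossed random factors in lme4 (variable names are illustrative):
    #   nested  (students in classrooms):  score ~ treatment + (1 | classroom/student)
    #   crossed (participants x stimuli):  y ~ condition + (1 | participant) + (1 | stimulus)
    library(lme4)
    fit <- lmer(y ~ condition + (1 | participant) + (1 | stimulus), data = dat)
    summary(fit)   # may warn about a singular fit with toy data; the syntax is the point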

  13. Grand mean = 100

  14. Condition means: MeanA = -5, MeanB = 5 (deviations from the grand mean)

  15. Participant intercepts: 5.86, 7.09, -1.09, -4.53

  16. Stimulus intercepts: -2.84, -9.19, -1.16, 18.17

  17. Participant slopes: 3.02, -9.09, 3.15, -1.38

  18. Everything else = residual error
  (A sketch of where each of these components lives in a fitted model follows below.)
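The following sketch, not from the slides, shows where each of the components above lives in a fitted crossed-random-effects model, reusing the toy fit object from the earlier sketch; with real data one would read the analogous quantities from the model fit to that dataset.

    # Mapping the components above onto a fitted model (reusing the toy fit object):
    fixef(fit)               # fixed effects: grand mean (intercept) and Condition effect
    ranef(fit)$participant   # per-participant intercept deviations (and slopes, if modeled)
    ranef(fit)$stimulus      # per-stimulus intercept deviations
    VarCorr(fit)             # variance components, including the residual error variance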

  19. The linear mixed-effects model with crossed random effects
  [Slide shows the model equation, with its fixed-effects and random-effects terms labeled]

  20. The linear mixed-effects model with crossed random effects
  [Slide labels the intercept and slope terms of the equation and the 6 parameters to be estimated; a hedged reconstruction of the equation follows below]
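The equation itself did not survive the transcript; the following LaTeX reconstruction is consistent with the R/SAS/SPSS syntax on the next slide, though the notation is ours rather than the slide's. The response of participant i to stimulus j, with condition code C_j, gets a fixed intercept and Condition slope, by-participant random intercepts and slopes, by-stimulus random intercepts, and a residual.

    % Reconstruction (not verbatim from the slide) of the crossed random effects model
    \begin{aligned}
    Y_{ij} &= \underbrace{\beta_0 + \beta_1 C_j}_{\text{fixed effects}}
            \;+\; \underbrace{u_{0i} + u_{1i} C_j + w_{0j} + e_{ij}}_{\text{random effects}} \\
    (u_{0i}, u_{1i})^\top &\sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_u), \qquad
    w_{0j} \sim \mathcal{N}(0, \sigma^2_w), \qquad
    e_{ij} \sim \mathcal{N}(0, \sigma^2_e)
    \end{aligned}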

  21. Fitting mixed models is easy: Sample syntax

  R:
    library(lme4)
    model <- lmer(y ~ c + (1 | j) + (c | i))

  SAS:
    proc mixed covtest;
      class i j;
      model y=c/solution;
      random intercept c/sub=i type=un;
      random intercept/sub=j;
    run;

  SPSS:
    MIXED y WITH c
      /FIXED=c
      /PRINT=SOLUTION TESTCOV
      /RANDOM=INTERCEPT c | SUBJECT(i) COVTYPE(UN)
      /RANDOM=INTERCEPT | SUBJECT(j).
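As a usage note, and only a sketch rather than material from the slides: applied to the toy data frame from earlier (with i = participant, j = stimulus, and c = a numeric condition code), the R call looks like the line below. The lme4 summary reports the fixed effect of condition and the variance components; add-on packages such as lmerTest can supply degrees of freedom and p-values if desired.

    # The R model above, fit to the toy data frame from the earlier sketch
    # (lme4 was loaded above). With such a tiny dataset the fit may be singular;
    # with real data it would not be.
    dat$c <- ifelse(dat$condition == "A", -0.5, 0.5)
    m <- lmer(y ~ c + (1 | stimulus) + (c | participant), data = dat)
    summary(m)   # fixed effect of c, participant and stimulus variance components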

  22. Mixed models successfully maintain the nominal Type 1 error rate (α = .05)

  23. Applications to existing datasets
  • Representative simulated dataset (for comparison)
  • Afrocentric features data (Blair et al., 2002, 2004, 2005)
  • Shooter data (Correll et al., 2002, 2007)
  • Psi / retroactive priming data (Bem)
    • Forward-priming condition (classic evaluative priming effect)
    • Reverse-priming condition (psi condition)

  24. Comparison of effects between RM-ANOVA and mixed model analyses

  25. Comparison of effects between RM-ANOVA and mixed model analyses

  26. Comparison of effects between RM-ANOVA and mixed model analyses

  27. Conclusion
  • Many failures of replication are probably due to stimulus sampling and the failure to take it into account
  • Mixed models with crossed random effects allow generalization to future studies with different samples of both stimuli and participants

  28. The end
  Further reading: Judd, C. M., Westfall, J., & Kenny, D. A. (2012). Treating stimuli as a random factor in social psychology: A new and comprehensive solution to a pervasive but largely ignored problem. Journal of Personality and Social Psychology, 103(1), 54-69.
