
Mixed designs


Presentation Transcript


  1. Mixed designs

  2. Mixed designs • We’ve discussed between groups designs, looking at differences across independent samples • We’ve also talked about within groups designs, looking for differences across treatments in which each subject participates in every treatment.

  3. Between groups design • Typing speed: random assignment to Music or No Music conditions

  4. Repeated Measures example • Here each person is measured in both the Music and No Music conditions

  5. Between groups and RM • The research question can often determine the design; however, there are some factors that we could not examine in a repeated measures design (e.g. ethnicity), and others where the dependency would preclude a between groups design (pre-post conditions) • In cases where we might have a choice (as in the previous example) the RM design would most likely be preferred • When subjects are observed only once, their differences contribute to the error term. On repeated occasions we can obtain an estimate of the degree of subject differences and partial that out of the error term • Fewer subjects needed • More power

  6. Mixed design • A x (B x S) • At least one between, one within subjects factor • Each level of factor A contains a different group of e.g. randomly assigned subjects • On the other hand, each level of factor B at any given level of factor A contains the same subjects

  7. Partitioning the variability • Partitioning the variance is done as for a standard ANOVA • Between groups variance • Within groups variance • What error term do we use for the respective between and within subjects factors, as well as the interaction of the two?

  8. Partitioning the variability • Again we adopt the basic principle we have followed previously in looking for effects: we want to separate treatment effects from error • A part due to the manipulation of a variable, the treatment part (treatment effects) • A second part due to all other unsystematic or uncontrolled sources of variability (error) • The deviation of scores associated with the error (unaccounted for variance) can be divided into two different components: • Between Subjects Error • Estimates the extent to which chance factors are responsible for any differences among the different levels of the between subjects factor • Within Subjects Error • Estimates the extent to which chance factors are responsible for any differences observed within the same subject

  9. How it breaks down • SStotal splits into between subjects and within subjects variability: • Between subjects: SSA (df = a - 1) and SSsubjects within groups (df = a(s - 1)) • Within subjects: SSB (df = b - 1), SSAxB (df = (a - 1)(b - 1)), and SSerror, the B x subject within groups term (df = a(b - 1)(s - 1))

  10. Comparing the different designs • Note that the between groups outcome (F and p-value) is the same whether it is part of the mixed or the standard b/t groups design • In the mixed design, the repeated measures are ‘collapsed’, making each subject’s score for the between groups factor the mean of those repeated measures • The same is true for the within groups design, except that in the mixed design the subjects are nested within factor A, and the interaction A x B is taken out of the error term • The SSb/t subjects in the within design is the error term for the between groups factor in the mixed design • Sources of variability in each design (error terms noted in parentheses): • B/t groups design: SSA, SSA/S (error) • W/in groups design: SSS, SSB, SSBxS (error) • Mixed design: SSA (error term SSA/S), SSB and SSAxB (error term SSBxS)

  11. Comparing the different designs • The SSb/t subjects in general reflects the deviation of subjects from the grand mean while the SSw/in in general reflects their deviation from their own mean

  12. Example • 2 x 3 mixed factorial design • Gender and tv viewing habits (hours watched per week):

gender   drama  comedy  news
male       4      7      2
male       3      5      1
male       7      9      6
male       6      6      2
male       5      5      1
female     8      2      5
female     4      1      1
female     6      3      4
female     9      5      2
female     7      1      1
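As a rough illustration, here is how this example could be set up and analyzed in R. This is only a sketch: the variable names Subject, Sex, Show, and Hours are assumptions chosen to line up with the R output in the appendix, not part of the original materials.

scores <- c(4, 7, 2,  3, 5, 1,  7, 9, 6,  6, 6, 2,  5, 5, 1,   # males: drama, comedy, news
            8, 2, 5,  4, 1, 1,  6, 3, 4,  9, 5, 2,  7, 1, 1)   # females
tv <- data.frame(
  Subject = factor(rep(1:10, each = 3)),
  Sex     = factor(rep(c("male", "female"), each = 15)),
  Show    = factor(rep(c("drama", "comedy", "news"), times = 10)),
  Hours   = scores
)
# Mixed ANOVA: Sex is the between subjects factor, Show the within subjects factor.
# Degrees of freedom: Sex a-1 = 1, subjects within groups a(s-1) = 8,
# Show b-1 = 2, Sex x Show (a-1)(b-1) = 2, B x subject error a(b-1)(s-1) = 16.
summary(aov(Hours ~ Sex * Show + Error(Subject/Show), data = tv))

The two error strata in the output correspond to the between subjects and within subjects error terms of slide 9, and the call should reproduce the R output shown in the appendix (slide 44).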

  13. Main Effects

  14. Compared to separate designs • Between subjects output (univariate output from the mixed analysis) • If one collapses the RM variables and performs a one-way ANOVA on the resulting dependent variable of subject means, the results are the same as in our mixed output
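For instance, continuing with the hypothetical tv data frame sketched above (an illustration only, not the author’s code):

# Collapse the repeated measures to each subject’s mean, then run a
# one-way between subjects ANOVA on Sex.
subj_means <- aggregate(Hours ~ Subject + Sex, data = tv, FUN = mean)
summary(aov(Hours ~ Sex, data = subj_means))
# The F and p-value for Sex match the between subjects part of the mixed output.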

  15. Compared to separate designs • Similarly, if we ignore gender and run a one-way RM ANOVA, we can see that this result is contained within the mixed design • However, in the mixed design the interaction sum of squares is taken out of what would have been simply error in the one-way design
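Again using the hypothetical tv data frame from the earlier sketch:

# One-way repeated measures ANOVA, ignoring Sex.
summary(aov(Hours ~ Show + Error(Subject/Show), data = tv))
# The Show sum of squares is the same as in the mixed output, but here the
# Sex x Show sum of squares stays in the error term rather than being
# partitioned out.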

  16. Interaction

  17. General Result • No main effect for gender • Main effect for tv show, but also gender x tv show interaction

  18. Simple effects • Comparisons reveal a statistical difference for gender in viewing comedy programs but not for the other program types
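One quick way to probe such simple effects in R, again using the hypothetical tv data frame (a rough sketch only; texts such as Keppel discuss the more appropriate pooled error terms, as noted later in these slides):

# Gender comparison within each program type, one at a time.
t.test(Hours ~ Sex, data = subset(tv, Show == "comedy"))
t.test(Hours ~ Sex, data = subset(tv, Show == "drama"))
t.test(Hours ~ Sex, data = subset(tv, Show == "news"))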

  19. Assumptions • Usual suspects: normality, homogeneity of variance, sphericity • For the between subjects effects, variances across groups must be similar • For the within subjects effects we also have an HoV requirement: • The error (tv show by subject interaction) must be the same for all groups

  20. Assumptions • In addition, the sphericity assumption extends beyond the within subjects factor • Our var/covar matrices must be similar across the between groups factor (gender) • Furthermore, the pooled (average/overall) var/covar matrix of the group var/covar matrices should be spherical • If the first is ok the second will be • Gist: variances of all possible difference scores among the treatments should be similar

  21. Post hocs and contrasts • If no statistically significant interaction, one may conduct post hoc analysis on the significant main effects factors as described previously • Planned contrasts can be conducted to test specific hypotheses

  22. Planned contrasts • Focused contrasts can get complicated regarding interactions • The following example is for a between groups factorial design regarding treatment of depression • Example: Age x Therapy, with Non-Hospitalization (Psychotherapy, Companion) and Hospitalization (Traditional, Milieu) conditions • Row and column weights must sum to zero • Does the effect of hospitalization vary as a function of a linear trend with age? • Younger benefit more from nonhospitalization • Cell weights (each is the row weight times the column weight):

                          Non-Hospitalization        Hospitalization
             Row weight   Psychotherapy  Companion   Traditional  Milieu
Column weight                   1            1           -1         -1
Old              1              1            1           -1         -1
Middle           0              0            0            0          0
Young           -1             -1           -1            1          1
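A small sketch in R verifying that these interaction contrast weights behave as required (illustrative only):

w <- rbind(Old    = c( 1,  1, -1, -1),
           Middle = c( 0,  0,  0,  0),
           Young  = c(-1, -1,  1,  1))
colnames(w) <- c("Psychotherapy", "Companion", "Traditional", "Milieu")
rowSums(w)   # each row sums to zero
colSums(w)   # each column sums to zero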

  23. Planned contrasts • Example weights for testing a linear trend for age in the Psychotherapy and Traditional groups (opposite to each other), and a quadratic trend for the Companion and Milieu groups (also opposite) • We could break down the interaction into an orthogonal set of contrasts • These sum up to the interaction (sums of squares)

          Non-Hospitalization        Hospitalization
        Psychotherapy  Companion   Traditional  Milieu
Old          -1            1            1         -1
Middle        0           -2            0          2
Young         1            1           -1         -1

  24. Planned contrasts • With mixed designs it can be difficult to determine the appropriate error term • Consult texts such as Keppel, or Rosenthal and Rosnow, for ideas on how to proceed • Essentially we will have an interaction contrast x subjects error • Furthermore, it has been shown by some that such analyses can be very sensitive to violations of our assumptions (sphericity)

  25. More complex mixed designs • Of course we may have multiple between or within groups factors • Gist of the approach is pretty much the same for multiple factors of either between or within subjects factors • We are interested in interactions involving the two types of factors

  26. Two between one within • In this case we will have our typical factorial output and with interaction etc. to interpret • Now we will also look to see if the between subjects interaction changes over the levels of the repeated measure

  27. Example • Anxiety in the final weeks of the semester, measured on five successive occasions (t1–t5):

gender  college     t1  t2  t3  t4  t5
guys    A&S          3   1   4   6   7
guys    A&S          1   2   5   5   5
guys    A&S          4   6   7   7   8
guys    Business     0   4   4   7   8
guys    Business     2   3   5   7   8
guys    Business     0   4   4   4   8
guys    Music        1   3   3   4   4
guys    Music        1   3   3   5   6
guys    Music        1   4   7   7   8
guys    Education    3   5   8   7   6
guys    Education    0   2   3   6   4
guys    Education    2   1   2   5   5
gals    A&S          3   3   5   7   7
gals    A&S          0   1   3   2   4
gals    A&S          2   5   6   6   7
gals    Business     1   3   6   5   6
gals    Business     0   4   6   7   6
gals    Business     2   2   3   5   7
gals    Music        2   3   5   7   8
gals    Music        0   4   5   8   8
gals    Music        1   4   5   7   7
gals    Education    1   4   4   5   8
gals    Education    1   2   4   6   8
gals    Education    2   5   6   7   7
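A sketch of how this two between, one within design could be set up and fit in R (the names Subject, Gender, College, Time, and Anxiety are assumptions for illustration):

vals <- c(3,1,4,6,7,  1,2,5,5,5,  4,6,7,7,8,   # guys, A&S
          0,4,4,7,8,  2,3,5,7,8,  0,4,4,4,8,   # guys, Business
          1,3,3,4,4,  1,3,3,5,6,  1,4,7,7,8,   # guys, Music
          3,5,8,7,6,  0,2,3,6,4,  2,1,2,5,5,   # guys, Education
          3,3,5,7,7,  0,1,3,2,4,  2,5,6,6,7,   # gals, A&S
          1,3,6,5,6,  0,4,6,7,6,  2,2,3,5,7,   # gals, Business
          2,3,5,7,8,  0,4,5,8,8,  1,4,5,7,7,   # gals, Music
          1,4,4,5,8,  1,2,4,6,8,  2,5,6,7,7)   # gals, Education
anx <- data.frame(
  Subject = factor(rep(1:24, each = 5)),
  Gender  = factor(rep(c("guys", "gals"), each = 60)),
  College = factor(rep(rep(c("A&S", "Business", "Music", "Education"), each = 15), times = 2)),
  Time    = factor(rep(1:5, times = 24)),
  Anxiety = vals
)
# Two between subjects factors (Gender, College), one within (Time).
summary(aov(Anxiety ~ Gender * College * Time + Error(Subject/Time), data = anx))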

  28. Breakdown of SS

  29. Write Down Your Expectations Now

  30. Results • Regardless of gender or college affiliated with, anxiety increases at approximately the same rate as one approaches finals • Shocking!

  31. Results • None of the between subjects effects are noticeable statistically or practically • Only time is a noticeable factor in this study

  32. One between Two within • Again we will have our typical output as we would with a two within design • We will also look to see if the within subjects interaction changes over the levels of the between subjects factor

  33. Example • Are there differing effects for age regarding verbal and visuospatial ability? • Age x (Verbal/visuo-spatial ability x Block) • 2 x (2 x 6)
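A structural sketch of this one between, two within design in R. The data below are random placeholders used only to show the model and error structure, and the names Subject, Age, Task, Block, and Score are illustrative, not from the study:

set.seed(1)
dat <- expand.grid(Subject = factor(1:20),
                   Task    = factor(c("verbal", "visuospatial")),
                   Block   = factor(1:6))
dat$Age   <- factor(ifelse(as.integer(dat$Subject) <= 10, "young", "old"))
dat$Score <- rnorm(nrow(dat))   # placeholder scores only
# Each within subjects source (Task, Block, Task x Block) gets its own error stratum.
summary(aov(Score ~ Age * Task * Block + Error(Subject/(Task * Block)), data = dat))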

  34. Breakdown

  35. Main Effects • Start simple and build from there • Use visual displays to keep things straight • All three main effects significant

  36. 2 way interactions • Only the type of task by block interaction was close (p = .057), but the partial eta squared of .022 is pretty small • Though the tasks started out similarly, there was less improvement over blocks for the visuospatial task

  37. 3 way interaction • Significant • No real interaction for the young between type of task and rate of improvement • With the older folks we see the interaction alluded to in the previous 2-way

  38. Results

  39. Simple effects • In order to test for simple effects we must have the appropriate error term for analysis • Breakdown of general error terms for the previous designs (2 within on left, 2 between subjects factors on right; from Keppel)

  40. Simple effects • Error terms for simple effects (from Winer) • Comparison to the appropriate critical value with appropriate degrees of freedom for pooled sources of variability from mixed sources can get a little weird • Consult an appropriate text • 2 between, 1 within: A x B x C • 1 between, 2 within: A x (B x C) • MSA x subj = MSerror(a), MSB x subj = MSerror(b), MSC x subj = MSerror(c), MSBC x subj = MSerror(bc) • *q and r refer to the number of levels of the repeated measures factors B and/or C

  41. Summary • Mixed design encompasses at least one between subjects factor (independent groups) and one repeated measures factor • The approach is the same as it was for either separately: look for main effects and interactions • In the simplest setting an interaction suggests that the between groups differences are changing over the levels of the repeated measure (or that the repeated measure effect varies depending on which group you are talking about) • With more complex designs, interactions are themselves changing over the levels of another variable • The best approach is to start simple (examine main effects) and work your way up, and in the presence of a significant interaction, make sure that your simple effects are tested appropriately

  42. Appendix

  43. In SPSS • In SPSS, even though we have a between groups factor, we still use the RM (repeated measures) menu

  44. R output for first mixed example

Error: Subject
          Df Sum Sq Mean Sq F value Pr(>F)
Sex        1  3.333   3.333  0.4717 0.5116
Residuals  8 56.533   7.067

Error: Subject:Show
          Df Sum Sq Mean Sq F value    Pr(>F)
Show       2 58.067  29.033  22.051 2.523e-05 ***
Sex:Show   2 44.867  22.433  17.038 0.0001086 ***
Residuals 16 21.067   1.317
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
