
Chapter Thirteen




  1. Chapter Thirteen Hypothesis Testing for Two or More Means: The One-Way Analysis of Variance

  2. More Statistical Notation • Analysis of variance is abbreviated as ANOVA • An independent variable is called a factor • Each condition of the independent variable is also called a level or a treatment, and differences produced by the independent variable are a treatment effect • The symbol for the number of levels in a factor is k

  3. An Overview of ANOVA

  4. One-Way ANOVA A one-way ANOVA is performed when only one independent variable is tested in the experiment

  5. Between Subjects • When an independent variable is studied using independent samples in all conditions, it is called a between-subjects factor • A between-subjects factor involves using the formulas for a between-subjects ANOVA

  6. Within Subjects Factor • When a factor is studied using related (dependent) samples in all levels, it is called a within-subjects factor • This involves a set of formulas called a within-subjects ANOVA

  7. Analysis of Variance • The analysis of variance is the parametric procedure for determining whether significant differences occur in an experiment containing two or more sample means • In an experiment involving only two conditions of the independent variable, you may use either a t-test or the ANOVA

  8. Diagram of a Study Having Three Levels of One Factor

  9. Experiment-Wise Error • The overall probability of making a Type I error somewhere in an experiment is called the experiment-wise error rate • When we use a t-test to compare only two means in an experiment, the experiment-wise error rate equals α

  10. Comparing Means • When there are more than two means in an experiment, performing multiple t-tests results in an experiment-wise error rate that is much larger than the α we have selected • Using the ANOVA allows us to compare the means from all levels of the factor and keep the experiment-wise error rate equal to α

  11. Assumptions of the ANOVA • Each condition contains a random sample of interval or ratio scores • The population represented in each condition forms a normal distribution • The variances of all populations represented are homogeneous

  12. Statistical Hypotheses
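
The hypotheses themselves do not survive in this transcript; the standard statement for a one-way ANOVA with k levels is:

\[ H_0:\ \mu_1 = \mu_2 = \cdots = \mu_k \qquad H_a:\ \text{not all } \mu\text{'s are equal} \]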

  13. The F-Test • The statistic for the ANOVA is F • When Fobt is significant, it indicates only that somewhere among the means at least two of them differ significantly • It does not indicate which specific means differ significantly • When the F-test is significant, we perform post hoc comparisons

  14. Post Hoc Comparisons • Post hoc comparisons are like t-tests • We compare all possible pairs of level means from a factor, one pair at a time

  15. Components of ANOVA

  16. Sources of Variance • There are two potential sources of variance • Scores may differ from each other even when participants are in the same condition. This is called the variance within groups • Scores may differ from each other because they are from different conditions. This is called the variance between groups

  17. Mean Squares • The mean square within groups is an estimate of the variability in scores as measured by differences within the conditions of an experiment • The mean square between groups is an estimate of the differences in scores that occur between the levels in a factor

  18. The F-Distribution The F-distribution is the sampling distribution showing the various values of F that occur when H0 is true and all conditions represent one population

  19. Sampling Distribution of F When H0 Is True

  20. Degrees of Freedom • The critical value of F (Fcrit) depends on • The degrees of freedom (both the dfbn = k - 1 and the dfwn = N - k) • The α selected • The F-test is always a two-tailed test

  21. Computing the F-Ratio
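
The ratio itself does not survive in this transcript; it is computed from the two mean squares defined on the previous slides:

\[ F_{\text{obt}} = \frac{MS_{bn}}{MS_{wn}} \]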

  22. Sum of Squares • The computations for the ANOVA require the use of several sums of squared deviations • Each of these terms is called the sum of squares and is symbolized by SS

  23. Summary Table of a One-Way ANOVA

      Source     Sum of Squares   df       Mean Square   F
      Between    SSbn             dfbn     MSbn          Fobt
      Within     SSwn             dfwn     MSwn
      Total      SStot            dftot

  24. Computing Fobt • Compute the total sum of squares (SStot)
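
The formula does not survive in this transcript; the usual computational formula for the total sum of squares is:

\[ SS_{tot} = \Sigma X_{tot}^{2} - \frac{(\Sigma X_{tot})^{2}}{N} \]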

  25. Computing Fobt • Compute the sum of squares between groups (SSbn)
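
Again the formula does not survive in this transcript; the usual computational formula, summing over the levels (columns) of the factor, is:

\[ SS_{bn} = \Sigma\left(\frac{(\Sigma X \text{ in the column})^{2}}{n \text{ in the column}}\right) - \frac{(\Sigma X_{tot})^{2}}{N} \]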

  26. Computing Fobt • Compute the sum of squares within groups (SSwn) • SSwn = SStot - SSbn

  27. Computing Fobt • Compute the degrees of freedom • The degrees of freedom between groups equals k - 1 • The degrees of freedom within groups equals N - k • The degrees of freedom total equals N - 1

  28. Computing Fobt • Compute the mean squares
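
The formulas do not survive in this transcript; each mean square is the corresponding sum of squares divided by its degrees of freedom:

\[ MS_{bn} = \frac{SS_{bn}}{df_{bn}} \qquad MS_{wn} = \frac{SS_{wn}}{df_{wn}} \]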

  29. Computing Fobt • Compute Fobt
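
Fobt is then MSbn divided by MSwn. As a minimal sketch of steps 24 through 29, the Python below computes Fobt from raw scores; the data and the level labels are hypothetical placeholders, not the example data from the slides.

    # Hypothetical scores for three levels of one between-subjects factor
    groups = {
        "level 1": [2, 4, 3, 5, 4, 3],
        "level 2": [6, 5, 7, 6, 5, 7],
        "level 3": [9, 8, 7, 9, 8, 9],
    }

    all_scores = [x for scores in groups.values() for x in scores]
    N = len(all_scores)                  # total number of scores
    k = len(groups)                      # number of levels in the factor
    grand_sum = sum(all_scores)

    # SStot: total sum of squares (slide 24)
    ss_tot = sum(x ** 2 for x in all_scores) - grand_sum ** 2 / N

    # SSbn: sum of squares between groups (slide 25)
    ss_bn = sum(sum(s) ** 2 / len(s) for s in groups.values()) - grand_sum ** 2 / N

    # SSwn = SStot - SSbn (slide 26)
    ss_wn = ss_tot - ss_bn

    df_bn, df_wn = k - 1, N - k                    # degrees of freedom (slide 27)
    ms_bn, ms_wn = ss_bn / df_bn, ss_wn / df_wn    # mean squares (slide 28)
    f_obt = ms_bn / ms_wn                          # Fobt (slide 29)
    print(f"F({df_bn}, {df_wn}) = {f_obt:.3f}")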

  30. Performing Post Hoc Comparisons

  31. Fisher’s Protected t-Test • When the ns in the levels of the factor are not equal, use Fisher’s protected t-test
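
The test statistic does not survive in this transcript; the protected t-test is an ordinary two-sample t in which MSwn replaces the pooled variance, compared against tcrit with df = dfwn:

\[ t_{\text{obt}} = \frac{\bar{X}_{1} - \bar{X}_{2}}{\sqrt{MS_{wn}\left(\frac{1}{n_{1}} + \frac{1}{n_{2}}\right)}} \]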

  32. Tukey’s HSD Test • When the ns in all levels of the factor are equal, use the Tukey HSD multiple comparisons test where qk is found using the appropriate table
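
The HSD formula does not survive in this transcript; with n scores in each level it is computed as:

\[ HSD = q_{k}\sqrt{\frac{MS_{wn}}{n}} \]

Two level means differ significantly when the absolute difference between them exceeds the HSD.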

  33. Describing the Relationship in a One-Way ANOVA

  34. Confidence Interval • The computational formula for the confidence interval for a single m is
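
The formula does not survive in this transcript; a common form, using MSwn as the error term and tcrit based on dfwn, is:

\[ \left(\sqrt{\frac{MS_{wn}}{n}}\right)(-t_{\text{crit}}) + \bar{X} \;\le\; \mu \;\le\; \left(\sqrt{\frac{MS_{wn}}{n}}\right)(+t_{\text{crit}}) + \bar{X} \]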

  35. Graphing the Results in ANOVA A graph showing means from three conditions of an independent variable.

  36. Proportion of Variance Accounted For • Eta squared indicates the proportion of variance in the dependent variable that is accounted for by changing the levels of a factor
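
The formula does not survive in this transcript; eta squared is the between-groups sum of squares as a proportion of the total:

\[ \eta^{2} = \frac{SS_{bn}}{SS_{tot}} \]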

  37. Omega Squared • In some instances, the effect size is reported using the measurement omega squared • Omega squared (ω²) is an estimate of the proportion of the variance in the population that would be accounted for by the relationship
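
The formula does not survive in this transcript; a commonly used estimate of omega squared is:

\[ \omega^{2} = \frac{SS_{bn} - (df_{bn})(MS_{wn})}{SS_{tot} + MS_{wn}} \]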

  38. Example • Using the following data set, conduct a one-way ANOVA. Use α = 0.05

  39. Example

  40. Example • dfbn = k - 1 = 3 - 1 = 2 • dfwn = N - k = 18 - 3 = 15 • dftot = N - 1 = 18 - 1 = 17

  41. Example

  42. Example • Fcrit for 2 and 15 degrees of freedom and α = 0.05 is 3.68 • Since Fobt = 4.951 is larger than Fcrit, the ANOVA is significant • A post hoc test must now be performed

  43. Example • The mean of sample 3 is significantly different from the mean of sample 2
