Using meta-analyses in your literature review
BERA Doctoral Workshop, 3rd September 2008
Professor Steven Higgins, Durham University
firstname.lastname@example.org
1952: Hans J. Eysenck concluded that there were no favorable effects of psychotherapy, starting a raging debate which 25 years of evaluation research and hundreds of studies failed to resolve
1978: To prove Eysenck wrong, Gene V. Glass statistically aggregated the findings of 375 psychotherapy outcome studies
Glass (and colleague Smith) concluded that psychotherapy did indeed work: “the typical therapy trial raised the treatment group to a level about two-thirds of a standard deviation on average above untreated controls; the average person receiving therapy finished the experiment in a position that exceeded the 75th percentile in the control group on whatever outcome measure happened to be taken” (Glass, 2000). Glass called the method “meta-analysis”
(adapted from Lipsey & Wilson, 2001)
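The arithmetic behind that quotation is worth unpacking once. Below is a minimal Python sketch, with invented group means and an invented pooled standard deviation (not Glass's data), showing how a standardised mean difference is computed and why an effect of about two-thirds of a standard deviation puts the average treated person near the 75th percentile of the control group.

```python
# Minimal sketch of a standardised mean difference (effect size).
# All numbers here are invented for illustration, not Glass's data.
from statistics import NormalDist

mean_treated, mean_control = 103.4, 100.0  # hypothetical outcome means
sd_pooled = 5.0                            # hypothetical pooled standard deviation

d = (mean_treated - mean_control) / sd_pooled
print(f"effect size d = {d:.2f}")          # 0.68, about two-thirds of an SD

# The "75th percentile" claim: the average treated person sits d standard
# deviations above the control mean, i.e. at the normal CDF value Phi(d).
print(f"percentile in control group = {NormalDist().cdf(d):.0%}")  # ~75%
```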
Karl Pearson (1904): averaged correlations for typhoid mortality after inoculation across 5 samples
R. A. Fisher (1932): “When a number of quite independent tests of significance have been made … although few or none can be claimed individually as significant, yet the aggregate gives an impression that the probabilities are on the whole lower than would often have been obtained by chance” (p. 99).
Source of the idea of cumulating probability values
W. G. Cochran (1937): discusses a method of averaging means across independent studies
Cochran (1954): set out much of the statistical foundation for meta-analysis (e.g., inverse variance weighting and homogeneity testing; see the sketch below)
(adapted from Lipsey & Wilson, 2001)
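To make those last two ideas concrete, here is a hedged sketch of fixed-effect inverse-variance weighting and the Q homogeneity test in Python. The effect sizes and standard errors are invented for illustration; only the formulas follow Lipsey & Wilson (2001).

```python
# Sketch of fixed-effect inverse-variance weighting and Cochran's Q.
# Effect sizes d_i and standard errors se_i below are invented examples.
from scipy.stats import chi2

effects = [0.30, 0.55, 0.42, 0.10]   # hypothetical study effect sizes d_i
std_errs = [0.12, 0.20, 0.15, 0.25]  # hypothetical standard errors se_i

weights = [1 / se**2 for se in std_errs]  # w_i = 1 / se_i^2: precise studies count more
mean_d = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se_mean = (1 / sum(weights)) ** 0.5

# Cochran's Q: weighted squared deviations from the pooled mean,
# referred to a chi-square distribution with k - 1 degrees of freedom.
Q = sum(w * (d - mean_d) ** 2 for w, d in zip(weights, effects))
p = chi2.sf(Q, df=len(effects) - 1)

print(f"pooled d = {mean_d:.2f} (SE {se_mean:.2f}), Q = {Q:.2f}, p = {p:.2f}")
```

A Q that is large relative to its degrees of freedom signals heterogeneity between studies that a single pooled average would hide.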
From: Marzano, R. J. (1998) A Theory-Based Meta-Analysis of Research on Instruction. Aurora, Colorado, Mid-continent Regional Educational Laboratory. Available at: http://www.mcrel.org:80/topics/products/83/ (accessed 2/9/08).
0.1 = percentile gain of 6 points (i.e., a school ranked 50th in a league table of 100 schools would move to about 44th place)
0.5 = percentile gain of 20 points (i.e., from 50th to 30th place)
1.0 = percentile gain of 34 points (i.e., from 50th to 16th place)
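These gains appear to come from the standard normal CDF: percentile gain = 100 × (Φ(d) − 0.5). The standard-library Python sketch below reproduces the conversion; computed directly, small effects give slightly lower gains than the rounded figures above (about 4 points rather than 6 for an effect size of 0.1).

```python
# Effect size to percentile gain via the standard normal CDF Phi.
from statistics import NormalDist

for d in (0.1, 0.5, 1.0):
    gain = 100 * (NormalDist().cdf(d) - 0.5)  # percentile gain in points
    # A school at place 50 in a league table of 100 moves up by roughly
    # the percentile gain.
    place = round(50 - gain)
    print(f"d = {d}: gain of about {gain:.0f} points; place 50 -> place {place}")
```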
0.2 “small” = difference in height between 15- and 16-year-olds
0.5 “medium” = difference in height between 14- and 18-year-olds
0.8 “large” = difference in height between 13- and 18-year-olds
Pearson et al. 2005
Bernard et al. 2004
Hattie and Timperley, 2007
1.04 CASE (Cognitive Acceleration Through Science Education) (Boys’ science GCSE - Adey & Shayer, 1991)
0.6 Direct instruction (Sipe & Curlette, 1997)
0.43 Homework (Hattie, 1999)
0.32 Formative assessment (KMOFAP)
0.31 ICT (Hattie, 1999)
0.1 Individualised instruction (Hattie, 1999)
Synthesis of study skills interventions
Meta-analysis of 51 studies of study skills interventions. The interventions were categorised using the SOLO model (Biggs & Collis, 1982); studies were classified into four hierarchical levels of structural complexity and as targeting either ‘near’ or ‘far’ transfer. The results support situated cognition: training for anything other than simple mnemonic tasks should be in context, use tasks within the same domain as the target content, and promote a high degree of learner activity and metacognitive awareness.
(average effect 0.4)
Self system - metacognition - cognition/knowledge
Self system: 0.74
The “flat earth” criticism is based on Lee Cronbach’s assertion that a meta-analysis looks at the “big picture” and provides only a crude average. According to Cronbach,
“… some of our colleagues are beginning to sound like a Flat Earth Society. They tell us that the world is essentially simple: most social phenomena are adequately described by linear relations; one-parameter scaling can discover coherent variables independent of culture and population; and inconsistencies among studies of the same kind will vanish if we but amalgamate a sufficient number of studies…The Flat Earth folk seek to bury any complex hypothesis with an empirical bulldozer…” (Cronbach, 1982, in Glass, 2000).
“Of course it mixes apples and oranges; in the study of fruit, nothing else is sensible; comparing apples and oranges is the only endeavor worthy of true scientists; comparing apples to apples is trivial” (Glass, 2000).
Scatterplot of the effects from individual studies (horizontal axis) against study size (vertical axis), i.e. a funnel plot
“Median effect sizes for studies with sample sizes less than 250 were two to three times as large as those of larger studies.” (Slavin & Smith, 2008)
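One mechanism that produces this pattern is selective reporting: a small study needs a large effect to reach statistical significance, so the small studies that get published are unrepresentative. The simulation below is my illustration of that mechanism, not Slavin and Smith's method; with a true effect of zero and only “significant” results surviving, the median surviving effect shrinks as sample size grows.

```python
# Toy simulation of publication bias: true effect is zero, but only
# studies reaching one-sided p < .05 are "published".
import random
import statistics
from statistics import NormalDist

def median_published_d(n_per_group, true_d=0.0, trials=2000):
    published = []
    for _ in range(trials):
        treat = [random.gauss(true_d, 1) for _ in range(n_per_group)]
        ctrl = [random.gauss(0, 1) for _ in range(n_per_group)]
        sd = statistics.pstdev(treat + ctrl)  # crude pooled SD
        d = (statistics.mean(treat) - statistics.mean(ctrl)) / sd
        se = (2 / n_per_group) ** 0.5         # approximate standard error of d
        if NormalDist().cdf(d / se) > 0.95:   # one-sided "significance" filter
            published.append(d)
    return statistics.median(published) if published else float("nan")

for n in (25, 100, 400):
    print(f"n = {n} per group: median published d = {median_published_d(n):.2f}")
```

The published medians fall as n grows even though the true effect is always zero, which is exactly the small-study inflation Slavin and Smith describe.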
Uses explicit rules to synthesise research findings
Can find relationships across studies which may not emerge in qualitative reviews
Does not (usually) exclude studies on grounds of methodological quality to the same degree as traditional review methods
Uses statistical data to determine whether relationships between constructs need clarifying
Can cope with large numbers of studies which would overwhelm traditional methods of review
EPPI-Centre, Institute of Education, London
The Campbell Collaboration
Best Evidence Encyclopedia, Johns Hopkins
Best Evidence Synthesis (BES), NZ
Institute for Effective Education (York)
Bernard, R.M., Abrami, P.C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P.A., Fiset, M. & Huang, B. (2004) How Does Distance Education Compare with Classroom Instruction? A Meta-Analysis of the Empirical Literature. Review of Educational Research, 74(3), pp. 379-439.
Chambers, E.A. (2004) An introduction to meta-analysis with articles from the Journal of Educational Research (1992-2002). Journal of Educational Research, 98, pp. 35-44.
Cronbach, L.J., Ambron, S.R., Dornbusch, S.M., Hess, R.O., Hornik, R.C., Phillips, D.C., Walker, D.F. & Weiner, S.S. (1980) Toward reform of program evaluation: Aims, methods, and institutional arrangements. San Francisco, CA: Jossey-Bass.
Glass, G.V. (2000) Meta-analysis at 25. Available at: http://glass.ed.asu.edu/gene/papers/meta25.html (accessed 9/9/08).
Hattie, J.A. (1992) Measuring the effects of schooling. Australian Journal of Education, 36(1), pp. 5-13.
Hattie, J., Biggs, J. & Purdie, N. (1996) Effects of Learning Skills Interventions on Student Learning: A Meta-analysis. Review of Educational Research, 66(2), pp. 99-136.
Hattie, J.A. (1987) Identifying the salient facets of a model of student learning: a synthesis of meta-analyses. International Journal of Educational Research, 11, pp. 187-212.
Hattie, J. & Timperley, H. (2007) The Power of Feedback. Review of Educational Research, 77(1), pp. 81-112.
Lipsey, M.W. & Wilson, D.B. (2001) Practical Meta-Analysis. Applied Social Research Methods Series (Vol. 49). Thousand Oaks, CA: SAGE Publications.
Marzano, R.J. (1998) A Theory-Based Meta-Analysis of Research on Instruction. Aurora, CO: Mid-continent Regional Educational Laboratory. Available at: http://www.mcrel.org:80/topics/products/83/ (accessed 2/9/08).
Pearson, D.P., Ferdig, R.E., Blomeyer, R.L. & Moran, J. (2005) The Effects of Technology on Reading Performance in the Middle-School Grades: A Meta-Analysis With Recommendations for Policy. Naperville, IL: University of Illinois/North Central Regional Educational Laboratory.
Sipe, T. & Curlette, W.L. (1997) A Meta-Synthesis Of Factors Related To Educational Achievement: A Methodological Approach To Summarizing And Synthesizing Meta-Analyses. International Journal of Educational Research, 25(7), pp. 583-698.
Slavin, R.E. & Smith, D. (2008) Effects of Sample Size on Effect Size in Systematic Reviews in Education. Paper presented at the annual meetings of the Society for Research on Effective Education, Crystal City, Virginia, March 3-4, 2008.