
A Meta-Analysis Primer




  1. Rick Chappell - 641, 12/15/2010 A Meta-Analysis Primer

  2. Outline I. Definition II. Example 1 - Cervical Carcinoma at UW III. How to Numerically Summarize Evidence IV. Examples of Presentation V. Reasons for Meta-Analysis VI. Problems with Meta-Analysis

  3. I. Definition • “Meta-analysis is a relatively new method for reviewing and combining results from multiple clinical trials. Whereas other review methods usually involve narrative discussions of individual trials, a meta-analysis … systematically combines and evaluates the results of clinical trials that have been completed.” - Spilker, Guide to Clinical Trials, 1991, Chapter 104

  4. A “systematic review” is: • Numeric • Complete • Often a separate publication • It can be used in the Discussion of a trial publication to compare its results with others’ and to summarize the evidence

  5. II. Example 1 - Cervical Ca at UW • Does treatment prolongation change outcome? • If so, among which subgroups of patients? (Example presented on Wed. 12/8)

  6. Moral: the scientific method is not a democratic process. Meta-analysis does not consist of “voting” for the most popular result. P-values do not provide a proper summary of evidence - estimates do.

  7. III. How to Numerically Summarize Evidence A. Continuous Outcomes - Difference in Means (usually) or medians. • Means are easy to combine: averaging means gives you a new mean (unlike with medians) • E.g., mean income is the average of mean male and mean female incomes (average may need to be weighted if # of males isn’t equal to # of females).
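
  A minimal Python sketch of the weighting idea above; the study means and sample sizes are invented purely for illustration.

    study_means = [2.1, 1.4, 3.0]   # mean treatment effect reported by each study (hypothetical)
    study_ns    = [40, 120, 75]     # number of patients behind each mean (hypothetical)

    # Weight each mean by its sample size so larger studies count more,
    # just as mean income is a weighted average of male and female means.
    pooled_mean = sum(m * n for m, n in zip(study_means, study_ns)) / sum(study_ns)
    print(f"Pooled (sample-size weighted) mean: {pooled_mean:.2f}")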

  8. B. Time to Event (“Survival”) Outcomes - Hazard Ratio (HR) HR = (Probability of event in a short time period for a ‘treated’ patient) / (Probability of event in a short time period for an ‘untreated’ patient). • Sometimes also called relative risk. • HRs are often constant over a wide range of risk levels.
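
  A minimal numeric sketch of this ratio; the short-interval event probabilities below are invented for illustration.

    p_event_treated   = 0.010   # P(event in a short interval | 'treated')    -- hypothetical
    p_event_untreated = 0.025   # P(event in the same interval | 'untreated') -- hypothetical

    hazard_ratio = p_event_treated / p_event_untreated
    print(f"HR = {hazard_ratio:.2f}")   # HR < 1: events occur at a lower rate under treatment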

  9. C. Binary Outcomes - Odds Ratio (OR) Odds = (Probability of event) / (Probability of no event) OR of event due to ‘treatment’ = (Odds of event given ‘treatment’) / (Odds when not given ‘treatment’)
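
  A short sketch applying these definitions to a hypothetical 2 x 2 table of counts (not data from the talk).

    events_trt, no_events_trt = 30, 70   # 'treated' arm (hypothetical counts)
    events_ctl, no_events_ctl = 50, 50   # 'untreated' arm (hypothetical counts)

    odds_trt = events_trt / no_events_trt    # odds = P(event) / P(no event)
    odds_ctl = events_ctl / no_events_ctl
    odds_ratio = odds_trt / odds_ctl
    print(f"Odds (treated) = {odds_trt:.2f}, odds (untreated) = {odds_ctl:.2f}, OR = {odds_ratio:.2f}")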

  10. Why are odds ratios commonly used? • They are useful in case-control studies: the odds ratio relating exposure to disease is the same whether it is built from the odds of being an asbestos miner given mesothelioma or from the odds of mesothelioma given being an asbestos miner. This isn’t the reason here. • Statistical Reasons - ORs are easy to combine (see the sketch below). • Like hazard ratios, ORs are often constant over a wide range of risk levels.
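
  One standard way to combine ORs is inverse-variance weighting of the log odds ratios (a fixed-effect model); the sketch below uses invented trial counts and only illustrates that general technique, not a calculation from the talk.

    import math

    # Each hypothetical trial: (events_trt, no_events_trt, events_ctl, no_events_ctl)
    trials = [(15, 85, 25, 75),
              (40, 160, 55, 145),
              (8, 42, 12, 38)]

    log_ors, weights = [], []
    for a, b, c, d in trials:
        log_or = math.log((a / b) / (c / d))
        var = 1/a + 1/b + 1/c + 1/d      # Woolf's approximate variance of the log OR
        log_ors.append(log_or)
        weights.append(1 / var)          # weight = inverse variance

    pooled_log_or = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    print(f"Pooled OR = {math.exp(pooled_log_or):.2f}")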

  11. IV. Examples of Presentation • Table of “O - E” (observed - expected in the ‘treatment’ group = excess incidence in that cohort). These can be added up. • Plot of trial # vs. odds ratio • Both combined: “Forest plot”
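
  A brief sketch of the “O - E” column: for each trial, subtract the events expected in the treatment arm under no treatment effect from those observed, then add across trials. The counts are invented for illustration.

    # Each hypothetical trial: (events_trt, n_trt, events_ctl, n_ctl)
    trials = [(12, 100, 20, 100),
              (30, 250, 44, 260),
              (5, 60, 9, 55)]

    total_o_minus_e = 0.0
    for o_trt, n_trt, o_ctl, n_ctl in trials:
        # Expected events in the treatment arm if treatment had no effect:
        # the total events split in proportion to that arm's share of patients.
        expected_trt = (o_trt + o_ctl) * n_trt / (n_trt + n_ctl)
        total_o_minus_e += o_trt - expected_trt

    print(f"Sum of O - E across trials: {total_o_minus_e:+.1f}")   # negative favors 'treatment'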

  12. Presentation of Meta-analysis: Example • Clahsen, et al. JCO 15, pp. 2526-2535 (1997). Perioperative polychemotherapy (PeCT) in breast cancer. • Forest plot of hazard ratio (PeCT vs. no PeCT) of disease-free survival.

  13. Example 3 - Berry, et al. (JCO, to appear)

  14. V. Reasons for Meta-Analysis (Armitage &amp; Colton, Enc. of Biostatistics Vol. 4, 1999) • Narrative reviews can be distorted and misleading. • The explosion of research evidence cannot be easily assimilated without a formal review. • Since individual clinical trials’ sample sizes may be too small to reliably detect clinically important effects, synthesis is necessary. Meta-analysis provides statistical power.

  15. VI. Problems with Meta-Analysis • Intertrial heterogeneity (see Table 104.1 of Spilker). • “Garbage in, garbage out.” • Publication bias - any relation between the direction of a study’s results and their dissemination.

  16. Publication Bias • E.g., ORG-2766 appeared to protect nerves from cytotoxic injury in 55 women with ovarian cancer and was published as an NEJM lead article (van der Hoop, et al., 1990); a subsequent negative study of 133 women appeared only as an ASCO Proceedings abstract (Neijt, et al., 1994). • Vickers, et al. (1998) show that the problem is widespread: in some countries, 100% of publications show treatment effects.

  17. An Attempted Cure - The Cochrane Collaboration • An international organization which maintains complete registries and prepares reviews of research on clinical interventions. • Home page: www.cochrane.org or www.cochrane.de • Cochrane library: www.cochrane.co.uk
