
Principles of Research Synthesis

Presentation Transcript


  1. Principles of Research Synthesis San Francisco Radiation Oncology Conference February 28 to March 2, 2003 Benjamin Djulbegovic, M.D., Ph.D. H. Lee Moffitt Cancer Center University of South Florida djulbebm@moffitt.usf.edu

  2. I The need for research synthesis

  3. The need for research synthesis • Health care decision makers need to access research evidence to make informed decisions on diagnosis, treatment and health care management for both individual patients and populations. • There are few important questions in health care that can be answered by the results of a single empirical study.

  4. II The problems with traditional review articles

  5. The need for research synthesis • Importance of review articles • Review articles in medical journals summarize large amounts of information on a particular topic • and are therefore a useful and popular source of information for health care professionals • review articles have the highest impact factor • which means that research, practice and policy decisions are significantly influenced by review articles

  6. Science of research synthesis: problems with traditional review articles • Personal views on the available body of evidence • The “monster” of selection bias and selective citation • has been pervasive in medicine, economics and the social sciences • can obscure up to 40-60% of a true intervention effect • In 2000, the Nobel Prize in Economic Sciences was awarded to James Heckman of the University of Chicago for his analysis of selection bias, which in turn profoundly affected applied research in economics as well as in other social sciences • Lack of reproducibility • that is, failure to meet a key scientific criterion

  7. Selective citation bias: the blind men and the elephant

  8. Critique of reviews of chemotherapy for ovarian cancer • Crx (chemotherapy) superior by qualitative analysis: 48/53 • Search strategy: 3/53 • Inclusion/exclusion criteria: 2/53 • Validity assessment: 1/53 • Quantitative assessment: 3/53 (Courtesy of Dr. C. Williams)

  9. Quality of review articles: of 158 articles, only 2 met all 10 criteria. Ann Intern Med 1999;131:947-951

  10. Research Synthesis: terminology • Systematic review. The application of strategies that limit bias in the assembly, critical appraisal, and synthesis of all relevant studies on a specific topic. Meta-analysis may be, but is not necessarily, used as part of this process. • Meta-analysis. The statistical synthesis of the data from separate but similar (i.e. comparable) studies, leading to a quantitative summary of the pooled results.

  11. Key Distinctions Between Narrative and Systematic Reviews, by Core Features of Such Reviews
  • Study question. Narrative review: often broad in scope. Systematic review: often a focused clinical question.
  • Data sources and search strategy. Narrative review: which databases were searched and the search strategy are not typically provided. Systematic review: comprehensive search of many databases as well as so-called gray literature, with an explicit search strategy.
  • Selection of articles for study. Narrative review: not usually specified, potentially biased. Systematic review: criterion-based selection, uniformly applied.
  • Article review or appraisal. Narrative review: variable, depending on who is conducting the review. Systematic review: rigorous critical appraisal, typically using a data extraction form.
  • Study quality. Narrative review: if assessed, may not use formal quality assessment. Systematic review: some assessment of quality is almost always included as part of the data extraction process.
  • Synthesis. Narrative review: often a qualitative summary. Systematic review: quantitative summary (meta-analysis) if the data can be appropriately pooled; qualitative otherwise.
  • Inferences. Narrative review: sometimes evidence-based. Systematic review: usually evidence-based.

  12. Principles of reliable detection of the effects of health care interventions • Methods to reduce bias • Methods to reduce statistical imprecision

  13. III Principles of systematic reviews and meta-analysis

  14. Principle #1: the need to consider the totality of evidence • “The world can only be considered as the totality of facts… for the totality of facts determines what is the case, and also whatever is not the case” L. Wittgenstein (“Tractatus Logico-Philosophicus”), 1921

  15. Principle #2: requirement for reproducibility • A transparent, explicit and systematic approach to identifying and synthesizing evidence • Methods for searching for evidence • Inclusion and exclusion criteria • Quality assessment

  16. Steps of a systematic review (flow diagram): search of personal files, consultation with experts, computerized databases, review of reference lists of articles, and systematic manual searches of key journals • Identify studies • Review for relevance (relevant vs. not relevant) • Evaluate methodological quality (studies that are not relevant or are of inadequate quality are rejected) • Extract data • Analyze data • Draw conclusions

  17. The QUOROM statement

  18. Principles of reliable detection of the effects of health care interventions • Systematic bias must be smaller than the effect of the intervention we are trying to detect • hence the need for the totality of evidence (published and unpublished) • Random errors (the play of chance) must be smaller than the effect of the intervention we are trying to detect • uncertainty/imprecision is reduced by pooling all available data • hence the need for a large number of patients/events

  19. Rationale for (quantitative) synthesis of all available evidence • The rationale for pooling data is clinical, not statistical • Similar interventions for similar conditions will produce similar effects (i.e. effects in the same direction) • While the effect size may not be the same, it will rarely be in the opposite direction • Meta-analysis attempts to show the direction of the effect (i.e. to help establish the generalisability of the effect)

  20. [Forest plot A) Disease population: relative risks (95% CI, fixed effect) for RCT1, RCT2 and RCT3 on a scale from 0.8 to 10, with values below 1 favoring the new treatment and values above 1 favoring control. Test for heterogeneity: chi-square, df = 2, p = 0.1. Test for overall effect: Z, p = 0.02.]

  21. [Forest plot B) Disease population: relative risks (95% CI, fixed effect) for RCT1, RCT2 and RCT3 on the same scale. Test for heterogeneity: chi-square, df = 2, p = 0.02. Test for overall effect: Z, p = 0.05.]
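
The two forest-plot slides report a chi-square test for heterogeneity and a Z test for the overall effect. As a minimal sketch of how such numbers are typically obtained for a fixed-effect summary, the snippet below pools three hypothetical relative risks (the slides show forest plots but no numeric trial data, so the inputs are invented for illustration) using inverse-variance weights, and computes Cochran's Q and the overall Z:

```python
import math

# Hypothetical per-trial results: (label, relative risk, lower 95% CI, upper 95% CI)
trials = [
    ("RCT1", 0.85, 0.70, 1.03),
    ("RCT2", 0.78, 0.60, 1.01),
    ("RCT3", 0.92, 0.71, 1.19),
]

# Work on the log scale; recover each standard error from the CI width.
log_rr = [math.log(rr) for _, rr, lo, hi in trials]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, _, lo, hi in trials]
w = [1 / s ** 2 for s in se]                      # inverse-variance weights

pooled = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
pooled_se = math.sqrt(1 / sum(w))

z = pooled / pooled_se                            # test for overall effect
p_overall = math.erfc(abs(z) / math.sqrt(2))      # two-sided p-value

q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
df = len(trials) - 1                              # chi-square with k-1 df

print(f"Pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * pooled_se):.2f}"
      f"-{math.exp(pooled + 1.96 * pooled_se):.2f})")
print(f"Overall effect: Z = {z:.2f}, two-sided p = {p_overall:.3f}")
print(f"Heterogeneity: Q = {q:.2f} on {df} df")
```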

  22. Calculate “observed minus expected” for each trial. Example 2 x 2 table (100 patients per arm, 200 in total):
  Dead: treated observed = 10 (expected = 12.5), control = 15, total = 25
  Alive: treated = 90, control = 85, total = 175
  o - e = -2.5, v = 5.5, odds ratio = 0.64, confidence interval = 0.28-1.46, P = 0.29
  Courtesy of Dr. K. Wheatley

  23. Compare only patients in one trial with patients in the same trial. Statistics (observed - expected; variance):
  Trial 1: (o - e)1; V1
  Trial 2: (o - e)2; V2
  Trial 3: (o - e)3; V3
  All trials: (o - e)T; VT
  Courtesy of Dr. K. Wheatley

  24. Compare only patients in one trial with patients in the same trial. Statistics (observed - expected; variance):
  Trial 1: -2.5; 5.5
  Trial 2: -2.5; 5.5
  …
  Trial 10: -2.5; 5.5
  All trials: -25.0; 55.0
  Odds ratio = 0.63, 95% confidence interval: 0.49 to 0.83, P < 0.001
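
A minimal Python sketch of the “observed minus expected” (Peto-style fixed-effect) calculation walked through on slides 22-24, using the hypothetical 2 x 2 table above (10/100 deaths on treatment vs. 15/100 on control) and then pooling ten such trials; the function names are mine, not from the slides:

```python
import math

def o_minus_e(d_treat, n_treat, d_ctrl, n_ctrl):
    """Observed minus expected deaths and its variance for one 2 x 2 table."""
    deaths, total = d_treat + d_ctrl, n_treat + n_ctrl
    expected = deaths * n_treat / total
    variance = (deaths * (total - deaths) * n_treat * n_ctrl
                / (total ** 2 * (total - 1)))
    return d_treat - expected, variance

def peto_pool(tables):
    """Sum O-E and V over trials; return pooled OR, 95% CI and two-sided p."""
    stats = [o_minus_e(*t) for t in tables]
    oe = sum(s[0] for s in stats)
    v = sum(s[1] for s in stats)
    log_or = oe / v
    half_width = 1.96 / math.sqrt(v)
    p = math.erfc(abs(oe / math.sqrt(v)) / math.sqrt(2))
    return (math.exp(log_or), math.exp(log_or - half_width),
            math.exp(log_or + half_width), p)

# One trial: 10/100 deaths on treatment vs. 15/100 on control (slide 22).
print(o_minus_e(10, 100, 15, 100))      # -> (-2.5, ~5.5)
print(peto_pool([(10, 100, 15, 100)]))  # OR ~0.63, CI ~0.28-1.46, p ~0.29

# Ten such trials pooled (slide 24): OR ~0.63, CI ~0.49-0.83, p < 0.001.
print(peto_pool([(10, 100, 15, 100)] * 10))
```

Note how the pooled odds ratio barely moves, while the confidence interval tightens and the P value falls: pooling trials reduces random error, not bias.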

  25. Rationale for (quantitative) synthesis of all available evidence • Reduction of bias: comparison of like with like • Use of randomized comparisons whenever possible • Always within the same trial • Pooling is done by adding trials (not patients) • Reduction of imprecision and uncertainty • Particularly important when the effects of interventions are of small to moderate size (e.g. RRR = 5-10% or 15-25%) • A 20% reduction in a 50% risk of death = avoidance of death in 1 of every 10 patients treated
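
The last bullet compresses a useful piece of arithmetic; a small illustrative snippet (the numbers are the slide's, the variable names are mine) showing how a 20% relative risk reduction on a 50% baseline risk of death translates into one death avoided per 10 patients treated:

```python
baseline_risk = 0.50   # risk of death without the new treatment (slide example)
rrr = 0.20             # relative risk reduction attributed to the treatment

treated_risk = baseline_risk * (1 - rrr)   # 0.40
arr = baseline_risk - treated_risk         # absolute risk reduction = 0.10
nnt = 1 / arr                              # number needed to treat = 10

print(f"ARR = {arr:.2f}; roughly 1 death avoided per {nnt:.0f} patients treated")
```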

  26. Effect of random errors • Function of the size of the trial • Subgroup analysis
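
As a rough illustration of the first bullet (hypothetical event rates, not taken from the slides), the sketch below shows how the 95% confidence interval around a crude odds ratio narrows as the trial grows, i.e. how random error is a function of trial size:

```python
import math

def or_ci(deaths_t, n_t, deaths_c, n_c):
    """Crude odds ratio with a Woolf (log-scale) 95% confidence interval."""
    a, b = deaths_t, n_t - deaths_t
    c, d = deaths_c, n_c - deaths_c
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (math.exp(log_or),
            math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

# The same underlying death rates (10% vs. 15%) observed in bigger and
# bigger trials: the point estimate stays put, the interval tightens.
for n_per_arm in (100, 1000, 10000):
    o, lo, hi = or_ci(round(0.10 * n_per_arm), n_per_arm,
                      round(0.15 * n_per_arm), n_per_arm)
    print(f"n = {n_per_arm:>5} per arm: OR = {o:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```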

  27. No evidence or no evidence of an effect? • Absence of evidence of benefit is not evidence of absence of benefit • Truly negative trial (evidence of no effect) vs. false-negative trial (no evidence of an effect)

  28. Size of randomized trials in myeloma

  29. Effect of chance: data-dependent subgroup analysis vs. indirect extrapolation of the overall analysis • Data-dependent subgroup analyses may lead to seriously biased conclusions… and should be avoided… • Paradoxically, even the effects among specific categories of patients may be best assessed indirectly, by applying the overall treatment effect to the patients in a specific category • as long as the effect in the specific subgroup is not qualitatively different from the overall effect

  30. Real trial (ISIS-2): EXAGGERATEDLY POSITIVE mortality effect in a subgroup defined only by astrological “birth sign”. Atenolol effect on day 0-1 mortality in acute myocardial infarction (mortality reduction comparing atenolol with control, with statistical significance given as 2P):
  • Leo (i.e. born between July 24 & August 23): 71% ± 23, 2P < 0.01
  • 11 other birth signs (taken separately): mean 24%, each 2P > 0.1 (NS)
  • Any birth sign (appropriate overall analysis): 30% ± 10, 2P < 0.004
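
To make the multiplicity point concrete, here is a small simulation sketch (hypothetical trial size and event rates, not the ISIS data): a single trial with a uniform 25% relative mortality reduction is split into 12 random “birth sign” subgroups, and the apparent subgroup-specific reductions scatter widely around the true effect by chance alone:

```python
import random
random.seed(1)

n_per_arm = 6000                      # hypothetical trial size
p_control, p_treated = 0.12, 0.09     # uniform true effect: 25% relative reduction

def simulate_arm(n, p_death):
    """Per-patient (birth_sign, died) pairs; the sign is pure noise."""
    return [(random.randrange(12), random.random() < p_death) for _ in range(n)]

control = simulate_arm(n_per_arm, p_control)
treated = simulate_arm(n_per_arm, p_treated)

# Apparent mortality reduction within each of the 12 "birth sign" subgroups:
# despite a uniform true 25% reduction, some subgroups look far better or
# worse (or even harmful) purely by chance.
for sign in range(12):
    dc = sum(died for s, died in control if s == sign)
    nc = sum(1 for s, _ in control if s == sign)
    dt = sum(died for s, died in treated if s == sign)
    nt = sum(1 for s, _ in treated if s == sign)
    rrr = 1 - (dt / nt) / (dc / nc)
    print(f"sign {sign:2d}: apparent mortality reduction = {rrr:+.0%}")
```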
