
IOM Standards for Systematic Reviews: Finding and Assessing Individual Studies





1. IOM Standards for Systematic Reviews: Finding and Assessing Individual Studies. Eduardo Ortiz, M.D., M.P.H., National Heart, Lung, and Blood Institute, National Institutes of Health. May 10, 2011

2. Acknowledgements. Colleagues at the Division for the Application of Research Discoveries at NHLBI: Denise Simons-Morton, MD, PhD (Division Director); Glen Bennett, MS; Janet de Jesus, MS, RD; Karen Donato, SM, RD; Rob Fulwood, PhD, MSPH; Edward Donnell Ivy, MD; Chi Onyewu, MD, PhD; Susan Shero, MS, RN; Joylene John-Sowah, MD; Sid Smith, MD; Zhi-Jie Zheng, MD, PhD

3. NHLBI Cardiovascular Clinical Guidelines

4. Other NHLBI Clinical Guidelines

5. Current Guideline Efforts
• High Blood Pressure
• Cholesterol
• Overweight/Obesity
• Adult Cardiovascular Risk Reduction
  • Crosscutting Work Groups to support our Adult CVD Guideline efforts: Lifestyle, Risk Assessment, Implementation
• Pediatric Cardiovascular Risk Reduction
• Sickle Cell Disease

6. Guidelines Development Process [flowchart]: Topic Area Identification → Expert Panel Selection → Analytic Models and Critical Questions → Literature Search → Screening → Data Abstraction → Study Quality Grading → Create Evidence Tables; Grade Body of Evidence → Evidence Statements and Recommendations → External Review with Revisions as Needed → Dissemination, Implementation, Evaluation

7. Evidence-Based Systematic Review Process
Step 1 – Develop analytic framework and critical questions
Step 2 – Establish inclusion and exclusion criteria
Step 3 – Screen titles, abstracts, and full text to identify relevant studies for inclusion
Step 4 – Rate the quality of each individual study
Step 5 – Abstract data for studies rated good or fair
Step 6 – Create evidence tables
Step 7 – Create summary tables
Step 8 – Create narrative summary
Step 9 – Rate the quality of the overall body of evidence for each critical question

8. Developing the Recommendations
Step 10 – Review the evidence
Step 11 – Develop evidence statements, including rating the quality of evidence for each statement
Step 12 – Develop recommendations, including grading the recommendations and making sure they are supported by the evidence
Step 13 – GLIA assessment of recommendations
Step 14 – Public comment period, with invitations for review
Step 15 – Review comments and revise recommendations
Step 16 – Final recommendations
Step 17 – Dissemination, implementation, and evaluation

9. Developing the Critical Questions
• Critical questions are developed by the expert panels working collaboratively with the methodology team and NHLBI leads
• PICOTS format (Population, Intervention, Comparator, Outcomes, Timing, Setting)
• Predefined inclusion and exclusion criteria (see the sketch below)
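To make the PICOTS structure concrete, here is a minimal Python sketch of a critical question bundled with its predefined inclusion/exclusion criteria. The class name, field names, and example question are all illustrative assumptions, not part of any NHLBI tool.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalQuestion:
    """One PICOTS-structured critical question with predefined I/E criteria.

    PICOTS = Population, Intervention, Comparator, Outcomes, Timing, Setting.
    All names here are illustrative, not an NHLBI data model.
    """
    population: str
    intervention: str
    comparator: str
    outcomes: list[str]
    timing: str
    setting: str
    inclusion_criteria: list[str] = field(default_factory=list)
    exclusion_criteria: list[str] = field(default_factory=list)

# A hypothetical blood-pressure question for illustration.
q = CriticalQuestion(
    population="Adults with hypertension",
    intervention="Antihypertensive drug therapy",
    comparator="Placebo or usual care",
    outcomes=["Cardiovascular events", "All-cause mortality"],
    timing="Follow-up of at least 1 year",
    setting="Outpatient care",
    inclusion_criteria=["Randomized controlled trial", "n >= 100"],
    exclusion_criteria=["Non-English publication"],
)
```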

10. Screening
• Each study is screened by two independent reviewers using the inclusion/exclusion (I/E) criteria
• Reviewers screen titles and abstracts, followed by full text
• If the reviewers disagree, they discuss and try to reach consensus
• If they do not achieve consensus or need additional input, a third party adjudicates
• Panel members can appeal a decision; the study is then re-assessed and adjudicated by a third party, but the panel cannot override a decision made by the reviewers (see the sketch below)
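The dual-review decision logic on this slide fits in a few lines of code. This is an illustrative sketch, not NHLBI's actual workflow software: the consensus discussion step is not modeled, and the adjudicator is a hypothetical callable standing in for the third party.

```python
from enum import Enum
from typing import Callable

class Vote(Enum):
    INCLUDE = "include"
    EXCLUDE = "exclude"

def screen_study(study_id: str,
                 reviewer_a: Vote,
                 reviewer_b: Vote,
                 adjudicate: Callable[[str], Vote]) -> Vote:
    """Decide inclusion for one study from two independent reviews.

    If the reviewers agree, their shared vote stands. If they disagree
    (and consensus discussion, not modeled here, fails), a third party
    adjudicates; per the slide, the panel cannot override that decision.
    """
    if reviewer_a == reviewer_b:
        return reviewer_a
    return adjudicate(study_id)

# Usage with a hypothetical adjudicator that includes borderline studies.
decision = screen_study("NCT0000001", Vote.INCLUDE, Vote.EXCLUDE,
                        adjudicate=lambda sid: Vote.INCLUDE)
```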

11. Lessons Learned and Challenges
• Sometimes the prespecified I/E criteria turn out to be wrong, and you have to be practical and make post-hoc adjustments
• Despite your best efforts, it is sometimes difficult to determine whether a study should be included or excluded

12. Rating the Evidence
• The quality of each included study is rated by two trained independent reviewers at the time of data abstraction
• No satisfactory tools were available for assessing study quality, so we developed our own, covering controlled intervention studies, cohort studies, case-control studies, and systematic reviews/meta-analyses (see the sketch below)
• Despite your best efforts, it is sometimes difficult to determine the quality rating of an individual study
• Panel members can appeal a rating; the study is then re-assessed and adjudicated by a third party, but the panel cannot override a decision made by the reviewers
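As a rough illustration of checklist-based quality rating, the sketch below maps the fraction of satisfied items to a Good/Fair/Poor grade. The checklist items and thresholds are assumptions for demonstration only; the actual NHLBI instruments are more detailed and specific to each study design.

```python
from enum import Enum

class Quality(Enum):
    GOOD = "good"
    FAIR = "fair"
    POOR = "poor"

# Hypothetical checklist for a controlled intervention study;
# the real instruments cover more domains.
RCT_CHECKLIST = [
    "randomization adequate",
    "allocation concealed",
    "groups similar at baseline",
    "blinded outcome assessment",
    "acceptable attrition",
]

def rate_study(items_met: set[str],
               checklist: list[str] = RCT_CHECKLIST) -> Quality:
    """Map the fraction of satisfied checklist items to a quality grade.

    The 0.8 / 0.5 thresholds are illustrative assumptions, not NHLBI rules.
    """
    frac = len(items_met & set(checklist)) / len(checklist)
    if frac >= 0.8:
        return Quality.GOOD
    if frac >= 0.5:
        return Quality.FAIR
    return Quality.POOR

# 4 of 5 items met -> Quality.GOOD under these toy thresholds.
grade = rate_study({"randomization adequate", "allocation concealed",
                    "groups similar at baseline", "acceptable attrition"})
```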

13. Lessons Learned and Challenges
• We used one abstractor plus a reviewer who checked the abstracted data for accuracy and completeness (see the sketch below)
• Consistent with the Buscemi study, we experienced a substantial number of errors
• Error rates depend on the individual abstractor and reviewer
• Checking takes a lot of time and can create a bottleneck
• Using two independent abstractors would not have been feasible
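A simple way to picture the reviewer's verification pass is a field-by-field comparison of the abstractor's record against the reviewer's check. The sketch below is hypothetical and the field names are invented; it simply flags disagreements so they can be resolved before the data enter the evidence tables.

```python
def find_discrepancies(abstracted: dict, verified: dict) -> dict:
    """Compare the abstractor's record against the reviewer's check.

    Returns {field: (abstractor_value, reviewer_value)} for every field
    where the two records disagree. Field names are illustrative.
    """
    keys = set(abstracted) | set(verified)
    return {k: (abstracted.get(k), verified.get(k))
            for k in keys if abstracted.get(k) != verified.get(k)}

# Example: the reviewer catches a transcription error in the sample size.
errors = find_discrepancies(
    {"n_randomized": 420, "mean_age": 61.2},
    {"n_randomized": 402, "mean_age": 61.2},
)
# -> {"n_randomized": (420, 402)}
```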

14. Rating the Evidence
• The overall body of evidence is rated for each critical question using a standardized rating instrument
• We reviewed GRADE, USPSTF, ACC-AHA, and many other systems
• We settled on a hybrid model similar to USPSTF, with ratings of High, Moderate, and Low (see the sketch below)
• We need a better, standardized, user-friendly approach to rating evidence and grading the strength of recommendations
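The sketch below shows one possible shape of such an instrument: a toy decision rule that aggregates study-level grades, consistency, and directness into a High/Moderate/Low rating. The rule is an assumption for illustration only; real systems such as GRADE or the USPSTF framework weigh many more domains (precision, risk of bias, publication bias, and so on).

```python
from enum import Enum

class BodyOfEvidence(Enum):
    HIGH = "high"
    MODERATE = "moderate"
    LOW = "low"

def grade_body(study_qualities: list[str],
               consistent: bool,
               direct: bool) -> BodyOfEvidence:
    """Aggregate study-level grades into one overall rating.

    Toy rule assuming 'good'/'fair'/'poor' study grades; the cutoffs
    are illustrative, not any published instrument's thresholds.
    """
    good = study_qualities.count("good")
    if good >= 2 and consistent and direct:
        return BodyOfEvidence.HIGH
    if good >= 1 or (consistent and study_qualities.count("fair") >= 2):
        return BodyOfEvidence.MODERATE
    return BodyOfEvidence.LOW

# Two good studies plus a fair one, consistent and direct -> HIGH.
rating = grade_body(["good", "good", "fair"], consistent=True, direct=True)
```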

15. Lessons Learned and Challenges
• For guidelines, you need to answer many questions
• Conducting multiple systematic reviews to answer these questions is very challenging and requires a lot of time, effort, expertise, manpower, and money
• How do we develop high-quality, credible systematic reviews to support our guidelines that can be completed and updated in a timely manner? Is there a sweet spot between evidence-based rigor and practicality?

16. Lessons Learned and Challenges
• Standard 3.2 – Take action to address potentially biased reporting of research results, for example by:
  • Searching grey literature and other sources of unpublished studies
  • Contacting researchers and asking them to clarify study-related information
  • Asking study sponsors to submit unpublished data
  • Searching conference abstracts
  • Including studies not published in English
• It already takes a lot of time and effort to search the published literature and conduct all the other steps in the SR process, even without searching all these additional resources
• Some of these issues will hopefully be addressed by investigators and publishers, as it is not realistic to expect most groups developing SRs to do all of this

17. Lessons Learned and Challenges
• Updating searches during and after the review process is very important, but once again practical considerations come into play
• A considerable amount of time can elapse between the initial search and completion of the review
• If you have to go back and update the material, it is not just the search that has to be repeated but all the other downstream steps in the evidence-based review process, which takes more time and can lead to a vicious cycle (see the sketch below)
• How do we deal with this from a practical perspective while maintaining a reasonable time frame and budget?
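One common mitigation, sketched below under stated assumptions, is a date-limited bridge search that covers only the gap since the original search, which bounds how much re-screening, re-rating, and re-abstraction the update triggers. The overlap window is an assumed guard against database indexing lag, not an NHLBI rule.

```python
from datetime import date, timedelta

def update_search_range(initial_search: date,
                        today: date,
                        overlap_days: int = 30) -> tuple[date, date]:
    """Date limits for a bridge search that refreshes the evidence base.

    Re-running the search from slightly before the original cutoff
    (the overlap guards against database indexing lag) limits the
    update to newly indexed studies. The 30-day overlap is an
    illustrative assumption.
    """
    return (initial_search - timedelta(days=overlap_days), today)

# Example: the review took 18 months; the bridge search covers that gap.
start, end = update_search_range(date(2009, 11, 1), date(2011, 5, 10))
```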

18. Lessons Learned and Challenges
• Screening studies for inclusion/exclusion, assessing study quality, adjudicating decisions, abstracting data, and creating evidence tables and summaries requires a lot of personnel, time, effort, and money
• If you don't have all the needed expertise in-house, contracting out the work can be challenging: high costs, plus coordination and decision-making across organizations and individuals
• Finding screeners, reviewers, and abstractors with enough methodological expertise and clinical knowledge to understand important contextual issues and other nuances is a challenge, especially when conducting multiple SRs
• Reviewers and methodologists differ in their knowledge, perspectives, biases, and attention to detail, so the quality of reviews can vary depending on the individuals involved

19. Final Comments
• We support the IOM recommendations, which will hopefully improve the quality and consistency of systematic reviews
• The report is comprehensive and represents an ideal process, but it would have been helpful to provide more practical or "at a minimum" recommendations that factor in the real-world limitations facing most organizations
• We would like a stronger linkage between the SR and CPG reports, including more practical recommendations to assist those of us who conduct SRs for the purpose of developing guidelines

  20. Thank you! Eduardo Ortiz, M.D., M.P.H. NHLBI Guideline Information: http://www.nhlbi.nih.gov/guidelines/index.htm
