
Methods for Comparative Evidence Reviews September 2005


Presentation Transcript


  1. Methods for Comparative Evidence Reviews, September 2005
  Oregon Evidence-based Practice Center for the Drug Effectiveness Review Project
  Marian McDonagh, PharmD, Project Director
  Mark Helfand, MD, MPH, EPC Director

  2. Health Resources Commission Remit to Oregon Evidence-based Practice Center
  - Apply systematic review methods to comparative questions (drug vs. drug within class)
  - Ensure that reviews are:
    - Methodologically consistent
    - Methodologically transparent
  - Reports are user friendly
  - Lowest risk of bias in methods and researchers

  3. Key Question Development
  - Question 1: Effectiveness and/or efficacy
    - Health outcomes preferred over intermediate outcomes
  - Question 2: Harms and tolerability
    - Short-term adverse events
    - Long-term safety
  - Question 3: Sub-populations
    - Age, race/ethnicity, gender, co-morbidities
  - Inclusion criteria based on key questions

  4. Quality Assessment
  - Each study included in the review is assessed for internal validity (quality) and external validity (generalizability/applicability)
  - Quality rated using predefined criteria:
    - Good: meets all criteria
    - Fair: meets most criteria
    - Poor: "fatal flaw" (combinations of criteria are not met that indicate significant risk of bias); not used in analysis
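The Good/Fair/Poor scheme above amounts to a simple decision rule. The Python sketch below is only an illustration of that logic under stated assumptions: the criterion names, the "most criteria" cutoff, and the function name rate_quality are hypothetical and are not part of the DERP rating instrument.

```python
from typing import Dict


def rate_quality(criteria_met: Dict[str, bool], has_fatal_flaw: bool) -> str:
    """Illustrative Good/Fair/Poor rule (not the actual DERP instrument).

    criteria_met maps each predefined quality criterion to whether the
    study satisfied it; has_fatal_flaw flags a combination of unmet
    criteria judged to indicate significant risk of bias.
    """
    if has_fatal_flaw:
        return "Poor"  # poor-quality studies are not used in analysis
    met = sum(criteria_met.values())
    if met == len(criteria_met):
        return "Good"  # meets all criteria
    if met >= 0.75 * len(criteria_met):  # "most criteria" cutoff is hypothetical
        return "Fair"
    return "Poor"


# Hypothetical study meeting four of five criteria, no fatal flaw -> "Fair"
print(rate_quality(
    {"randomization": True, "allocation concealment": True,
     "blinding": False, "complete follow-up": True,
     "intention-to-treat analysis": True},
    has_fatal_flaw=False,
))
```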

  5. Data Collection and Analysis
  - Study data abstracted into tables for cross-study comparison
  - Qualitative synthesis of data
  - Quantitative synthesis of data: meta-analysis when appropriate
  - Overall grade allocated for the body of evidence for each key question
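Where quantitative synthesis is described as "meta-analysis when appropriate", a common generic approach is inverse-variance pooling of study-level effects. The Python sketch below shows a fixed-effect version purely as an illustration: the inputs are made up, and the slides do not specify which pooling model the reviews use.

```python
import math


def fixed_effect_pool(effects, std_errors):
    """Inverse-variance (fixed-effect) pooling of study-level effect estimates.

    effects are per-study effect sizes on a common scale (e.g. log odds
    ratios) and std_errors are their standard errors; both are
    illustrative inputs, not DERP data.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci_95 = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci_95


# Made-up log odds ratios and standard errors from three hypothetical trials
print(fixed_effect_pool([-0.30, -0.10, -0.25], [0.12, 0.15, 0.20]))
```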

  6. Reports
  - Reports written with the user in mind
  - Draft reports undergo:
    - Peer review
    - Public comment
    - AHRQ review for methods
  - Every report includes a summary table that provides a summary of the evidence by key question
  - Every report has a slide show
  - Researchers make presentations to committees and are available for questions relating to the evidence (only)

  7. Updates
  - Every report is updated on a 6-month or yearly basis, determined by committees with input from researchers
  - Update process starts with revisiting the key questions:
    - Modifications needed?
    - New drugs?
    - New populations/indications to consider?
  - Update process otherwise identical to the original review

  8. Oregon Evidence-based Practice Center
  Oregon Health & Science University
  3181 SW Sam Jackson Park Road
  Portland, OR 97239
  Drug Effectiveness Review Project: www.ohsu.edu/drugeffectiveness
