
Principles of Comparative Effectiveness Research




Presentation Transcript


  1. Principles of Comparative Effectiveness Research Mark Helfand, MD Director, Oregon EPC http://www.ohsu.edu/epc/

  2. Outline • Antecedents • Comparative effectiveness reviews • Principles for CE research • Applying the principles • Methods research agenda

  3. Table published in the third edition of Florence Nightingale's Notes on Hospitals [9] Iezzoni, L. I. Ann Intern Med 1996;124:1079-1085

  4. Percent Mortality for Inpatients, 1863 • London hospitals: 91% • Hospitals in large towns: 83% • County hospitals: 39% • Other hospitals: 40% • Naval & military hospitals: 16% • Margate infirmary: 13% Source: Florence Nightingale: Measuring Hospital Care Outcomes by D. Neuhauser. Joint Commission on Accreditation of Health Care Organizations, 1999, 260 pages, ISBN 0-866-88559-5.

  5. [Chart: age-adjusted surgical rates per 10,000 population, showing wide variation across procedures: tonsillectomy, hysterectomy, prostatectomy, cholecystectomy, appendectomy, hernia repair]

  6. How sure are we? Expert estimates of breast implant rupture rates: 0% 0.2% 0.5% 1% 1% 1% 1.5% 2% 3% 3% 4% 5% 5% 5% 5% 5% 5% 5% 5% 6% 6% 6% 8% 10% 10% 10% 10% 13% 13% 15% 15% 18% 20% 20% 20% 25% 25% 25% 30% 30% 40% 50% 50% 50% 62% 70% 73% 75% 75% 75% 75% 80% 80% 80% 80% 80% 80% 100% Source: Dr. David Eddy
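The scatter in these estimates can be quantified directly. A minimal Python sketch, using the values transcribed from the slide above:

```python
import statistics

# Expert estimates of breast implant rupture rates (%), as listed on the slide
estimates = [
    0, 0.2, 0.5, 1, 1, 1, 1.5, 2, 3, 3, 4,
    5, 5, 5, 5, 5, 5, 5, 5,
    6, 6, 6, 8, 10, 10, 10, 10, 13, 13, 15, 15, 18,
    20, 20, 20, 25, 25, 25, 30, 30, 40,
    50, 50, 50, 62, 70, 73,
    75, 75, 75, 75,
    80, 80, 80, 80, 80, 80, 100,
]

print(f"n = {len(estimates)}")                          # number of experts
print(f"range: {min(estimates)}% to {max(estimates)}%")  # full spread
print(f"median: {statistics.median(estimates)}%")        # central estimate
```

The estimates span the entire 0-100% scale; the experts do not merely disagree at the margins, they disagree about whether rupture is rare or nearly certain, which is exactly the point of Eddy's survey.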

  7. Experts’ estimates of probability of acute retention in men with BPH Source: Dr. David Eddy

  8. [Chart: experts' estimates of the effect of colon cancer screening on chance of dying, plotted on a 0-100% scale] Source: Dr. David Eddy

  9. Summary • Practice and outcomes vary (Wennberg) • Clinicians are not very good at probabilities (Eddy, Tversky) • They may cite literature selectively or inaccurately • They even make logical errors in medical thinking, including “argument from authority” and “post hoc” reasoning • In making recommendations to a patient, they may ignore important information, such as what the patient values (McNeil) • Clinical research was often poorly conceived (Feinstein) and lacked relevance to everyday practice (Fry)

  10. 1990’s “Evidence-based Medicine” Outcomes Research 1994 OTA report S. 580, the "Healthcare Research and Quality Act of 1999”

  11. Outline • Antecedents • Comparative effectiveness reviews • Principles for CE research • Applying the principles • Methods research agenda

  12. 2000’s 2001- Oregon’s Practitioner-Managed Prescription Drug Plan (PMPDP) 2004- AHRQ’s Effective Health Care Program

  13. The Question: What is the kind and strength of the evidence you are relying on to make a recommendation?

  14. What does evidence-based mean? • A comprehensive, systematic, open-minded review of all the evidence • The evidence determines the conclusion, not vice versa • Not, the citation of papers supporting a preformed conclusion (and trashing of those that don’t) • Not, the use of evidence when it is ‘positive’ but judgement when it isn’t

  15. Systematic literature reviews • Are systematic to remove bias in finding and reviewing the literature.

  16. Systematic literature reviews • Are systematic to remove bias in finding and reviewing the literature. • Experts may interpret the data (and their own experience) differently.

  17. Systematic literature reviews • Are systematic to remove bias in finding and reviewing the literature. • Studies with disappointing results may get less attention

  18. [Figure] *Excludes 5 mg bid group

  19. Systematic literature reviews • Are systematic to remove bias in finding and reviewing the literature. • Critical details may be unavailable.

  20. Trial 114

  21. Included Drugs • clozapine: not posted • risperidone (1993): not posted • olanzapine (1996): not posted • quetiapine (1997): not posted • ziprasidone (2001): posted • aripiprazole (2002): posted

  22. Systematic literature reviews • Are systematic to remove bias in finding and reviewing the literature. • Experts may underplay controversy or select only supportive evidence

  23. Simpson et al, 2004

  24. Simpson et al, 2004

  25. In a double-blind study vs risperidone… GEODON sustained control of positive symptoms at 1 year

  26. Systematic literature reviews • Are systematic to remove bias in finding and reviewing the literature. • Experts may underplay controversy or select only supportive evidence • Emphasize the best evidence

  27. The best evidence • addresses health outcomes rather than intermediate outcomes. A health outcome is something a person can feel or experience (such evidence is called “direct”)

  28. [Analytic framework: Lipid-lowering drugs → (A) intermediate outcomes: lipid lowering, angiographic results → (B) clinical events: heart attacks, congestive heart failure, strokes → (C) health outcomes: mortality, function, quality of life]

  29. The best evidence • addresses health outcomes and not just intermediate outcomes • includes the spectrum of patients to whom a drug will be prescribed or a test will be applied, not just highly selected patients in research studies.

  30. The best evidence • addresses health outcomes and not just intermediate outcomes • includes the spectrum of patients to whom a drug will be prescribed • considers the potential harms as well as the benefits of the intervention being considered.

  31. The best evidence • addresses health outcomes and not just intermediate outcomes • is from “real” patients like ours, not just highly selected patients in studies. • considers the potential harms as well as the benefits of the intervention being considered. • is from well-designed, well-conducted studies.

  32. Systematic literature reviews • Define the strengths and limits of the evidence. • Clarify what is based on evidence and what is based on other grounds. • Do not necessarily tell you what to do when the evidence is limited. Other factors, such as equity, judgment, values, and preferences play a role in using the evidence. • In fact, the evidence base is usually inadequate to inform good decisions.

  33. An evidence-based decision process • Makes use of an independent, systematic review of the evidence • Employs rules for linking evidence to recommendations • Produces explicit, defensible recommendations

  34. Oregon Approach: What were we after? • Systematic drug-class reviews should address questions that reflect clinicians’ and patients’ concerns. • Decision-makers should begin to wrestle with the idea of what is good evidence. • Manufacturers should gain market share if they produce good evidence of superiority over other drugs in a class. • Patients, caregivers, payers should demand better evidence about outcomes that matter.

  35. Oregon Approach • An evidence-based process, not just systematic reviews • a process for selecting and refining questions that puts providers’ and patients’ concerns center stage

  36. Selecting questions • Researchers often use their own curiosity or research interest as the basis for selecting questions. • This can introduce bias into a study or a review.

  37. Selecting questions • Our premise is that important questions arise from practice, and from life. “Experts in practice”--and patients--select the populations, interventions, and outcome measures of interest.

  38. Selecting Questions • by using citizen panels, our process for selecting and refining questions puts providers’ and patients’ concerns center stage • the process illustrates that there is often a mismatch between the evidence people need to make decisions and the evidence researchers provide

  39. This process • Defines the populations, interventions, outcomes for the systematic review • Distinguishes health outcomes from intermediate outcomes • Identifies what types of studies will be considered suitable to answer the questions.

  40. 2000’s 2001- Oregon’s Practitioner-Managed Prescription Drug Plan (PMPDP) 2004- AHRQ’s Effective Health Care Program 2008- Knowing What Works 2009- ARRA, IOM priorities panel, Federal Coordinating Council (FCC)

  41. CER Key Characteristics • The objective is to inform decisions • Compares at least 2 alternatives, each with potential to be best practice • Analysis at the individual and group levels • Measures outcomes important to patients (both benefits AND harms) • Conducted in real-world settings

  42. Principles • Emphasize getting the questions right • Start with open-minded inquiry, not beating others on the head • Patient-centered: Anyone can nominate but formulating the questions must be broad-based • High standards regarding conflicts • Collaborate with policy makers but maintain separate identities

  43. Principles • Decision-makers should wrestle with what is good evidence • Clinicians should have high standards for evidence, while taking into account risk attitude and preferences • Market share should be determined by genuine promise and demonstrated value

  44. Funding CER • ARRA • CTSAs: infrastructure, training • PCORI: infrastructure, training

  45. Research Priorities: Improve methods for • Involving patients and the public • Evaluating the role of observational studies • Increasing the efficiency of trials • Addressing heterogeneity of treatment effects within studies • Incorporating preferences, values, and individual biological differences into the design of clinical research studies
