
Comparative Effectiveness Research: Rethinking Therapeutic Evaluation in Chronic Diseases



  1. Comparative Effectiveness Research: Rethinking Therapeutic Evaluation in Chronic Diseases. Ph. Ravaud

  2. Therapeutic Evaluation of Chronic Diseases • Today: mainly RCTs and meta-analysis (one drug) • Tomorrow: • RCTs • Meta-analysis and network meta-analysis (all drugs for a specific disease) • Observational data

  3. The Clinical Trials System is Broken • Too slow • Too expensive • Doesn't answer many critical questions (or doesn't answer questions relevant for physicians: short-term outcomes, inadequate comparators, side effects) • Otherwise it's great (RCTs are the best way to obtain groups of patients comparable for known and unknown prognostic factors)

  4. Applicability or Generalizability of Trials • Patients included are not representative of the patients treated in usual care (trial patients are younger and have fewer comorbidities) • Setting is not representative (centers are highly selected) • Treatments are evaluated mainly according to the principle "one size fits all"

  5. Critical Knowledge Gaps • The paradox: 18,000 RCTs published each year; more than 350,000 RCTs available • Despite that, available evidence remains limited or of poor quality

  6. Much of Care Today is Not Based on Scientific Evidence • Less than 20% of AHA/ACC heart disease management recommendations are based on a high level of evidence, and over 40% are based on the lowest level of evidence • The proportion of recommendations with high evidence levels has not increased over time • (Robert Califf, IOM Meeting on Evidence-Based Medicine, December 2007)

  7. From Meta-analysis to Network Meta-analysis • Meta-analysis: one treatment; thousands published per year • Network meta-analysis: all treatments available for a disease; fewer than 50 per year

  8. Network Meta-analysis • Combining direct and indirect evidence • [Network diagram linking Interventions A–F]
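
A minimal sketch of the indirect comparison that a network meta-analysis builds on, assuming the standard Bucher adjusted indirect comparison through a common comparator; the slide names no specific method, and the trial effects below are made up for illustration:

    import math

    def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
        """Estimate A vs C indirectly through the common comparator B.
        Inputs are relative effects (e.g. log odds ratios) and their
        standard errors from A-vs-B and C-vs-B trials."""
        d_ac = d_ab - d_cb                          # difference of effects
        se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)  # variances add
        return d_ac, se_ac

    # Hypothetical inputs: A vs B and C vs B log odds ratios
    d_ac, se_ac = bucher_indirect(d_ab=-0.40, se_ab=0.15, d_cb=-0.10, se_cb=0.20)
    print(f"Indirect A vs C: {d_ac:.2f} (SE {se_ac:.2f})")

A full network meta-analysis then pools such indirect estimates with any direct A-vs-C trials across the whole network of interventions.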

  9. Biologic Treatment in Rheumatoid Arthritis: Ongoing Trials • Only 5 head-to-head trials • Ongoing trials are recruiting patients failing to respond to methotrexate and with high disease activity

  10. Cost of RCTs: An Example • 18,000 patients • Total CRF pages: 1.8 million • Total CRF variables: 2.5 billion • Total number of queries: 600,000 • Cost: 700 million euros • As treatment effects decrease over time, the number of patients required mechanically increases

  11. Unrealistic to expect head-to-head RCTs addressing all 2-by-2 comparisons

  12. As much as we all love randomized effectiveness trials • It is an unrealistic expectation that we will have randomized trials for every intervention and its combinations in every patient subgroup (for example, if for a disease we have 20 different treatment options and 3 different subgroups of patients, we theoretically need at least 470 head-to-head trials; see the check below) • We need effectiveness evidence in a timely manner, and randomized trials take time to conduct • Therefore, 85% of the CER evidence is from non-experimental data!* • *Academy Health Report, June 2009
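
A quick check of that order of magnitude (a sketch, assuming each pair of treatments is compared in a separate trial within each subgroup):

    from math import comb

    treatments = 20
    subgroups = 3
    pairwise = comb(treatments, 2)   # 190 possible head-to-head comparisons
    trials = pairwise * subgroups    # 570 trials across the 3 subgroups
    print(pairwise, trials)          # -> 190 570

Under this assumption the count is 20-choose-2 × 3 = 570 trials, consistent with the slide's "at least 470".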

  13. Transparency of Clinical Trials • Reporting guidelines • ClinicalTrials.gov (2005) • FDA Amendment Act (2007)

  14. CONSORT Extensions • Too many extensions • Too many reporting guidelines • Editors do not really implement the guidelines • Quality of reporting remains poor

  15. Trial registration mandatory since 2005 (International Committee of Medical Journal Editors)

  16. Impact of Dissemination Bias • Separate meta-analyses of the FDA and journal data sets show that the increase in effect size ranged from 11% to 69% for individual drugs and was 32% overall (Turner et al., NEJM 2008)

  17. Selective Dissemination Bias

  18. FDA Amendment Act • US federal law enacted in 2007 mandates registration and results reporting for clinical trials of drugs, biological products, and devices at ClinicalTrials.gov • Study sponsors or PIs are required to report summary results information within 1 year of completing data collection for the prespecified primary outcome

  19. Levels of “Transparency” • Zarin DA, Tse T. Science. 2008 Mar 7;319(5868):1340-2.
