
Setting the Scene


Presentation Transcript


  1. Setting the Scene Dr Andrew Walker

  2. Contents • SMC Process (very briefly) • Analysis of decisions to date, emphasising economics aspects • Quality of economics evidence • Issues for economic models • Issues for sensitivity analysis

  3. SMC process • New medicines and major new indications • Onus on manufacturer to submit evidence • Forms and guidance on website • Sections on • Clinical trials • Safety • Effectiveness • Economics evidence • Budget impact

  4. Upon receipt … • Assigned to meeting and reviewer • Checklist for economics • Access to clinical experts, statistician, etc. • With pharmacist, joint recommendation • Goes to New Drugs Committee … • … their draft rec. to manufacturer … • SMC sees NDC view, manufacturer view, patient group view

  5. NICE (HTA (medicines (Scotland)))? • Selection of topics • Timescales • Resources • Nature of guidance • Processes (e.g. consultation) • Tasks undertaken • Possibly: burden of proof

  6. Submissions • 30% cost-utility analysis • 27% cost-effectiveness analysis • 24% no economic evaluation • 13% cost impact • 4% cost-minimisation analysis • 1% cost-consequence analysis

  7. How many trees died? • Recommended page limit is 50, but exactly half the submissions received exceed it • Average submission is 56 pages long (including economics appendices) • Average economics section is 15 pages • Economics is typically 24% of the total • Largest submission: 236 pages

  8. Guidance issued • Just over 100 pieces of guidance issued • www.scottishmedicines.org/medicines/default.asp • Acceptance rate about 70% • Falling very slightly over time • Goes to NHS boards and Area Drug & Therapeutics Committees • Sticking to timescales

  9. NICE overlap • 12 products reviewed by both groups; every decision has been the same • Accepted: peg interferon for hep C, imatinib for CML, rituximab for NHL • Not accepted: anakinra for RA; pimecrolimus for eczema (still provisional?) • Typically SMC announcement ten months earlier • A quarter of the duration and a tenth of the cost!

  10. PBAC overlap

  11. What helps or hinders? Medicines do slightly better if: • Mainly for use in secondary care • Potential to save life / improve compliance • 57th in class • New version of a medicine (e.g. pegylated) • Manufacturer is “frequent submitter” But NONE of these are deliberate

  12. Role of economics at SMC • Econ view supported 37% of submissions • When the econ view was supportive, 92% were accepted • When the econ view was not supportive, 56% were accepted • When SMC guidance is “not recommended”, the econ view was a factor in 90% of cases
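
(As a rough arithmetic check, these figures hang together: 0.37 × 92% + 0.63 × 56% ≈ 69%, consistent with the roughly 70% acceptance rate quoted on slide 8.)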

  13. So what does matter? • Quality of case submitted and whether it is fit for purpose • … er … • That’s it really.

  14. The SMC scepticism index • Almost every submission has some flaws • Not every flaw damns the submission • Flaws act cumulatively and probably interact as well to produce scepticism • At some point the evidence submitted is perceived to be so flawed that it can no longer be taken as offering convincing support • “The economic case has not been demonstrated”

  15. How do you define quality? • Guidance on our website, FAQs, checklist • Keeps fairly close to NICE guidance • Today is about starting to think how we take that forward in two key areas: • Economic models • Sensitivity analysis • So what’s the problem?

  16. 3 roles for models Economic models perform three roles: • Manufacturer constructs model to estimate costs and benefits beyond RCT follow-up • Partial or complete departure from RCT, model estimates all costs and benefits • A few key clinical results are used in a pre-existing model
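
To make the first of these roles concrete, here is a minimal sketch (in Python, with invented numbers) of extrapolating survival beyond trial follow-up under a constant-hazard assumption; the 80% one-year survival figure and the 3.5% discount rate are illustrative assumptions, not values from any submission.

```python
import numpy as np

# Role 1 sketch: extrapolate beyond RCT follow-up (all inputs invented).
# A trial reports 80% survival at 1 year; assuming a constant hazard
# (exponential survival), the implied annual hazard is -ln(S) / t.
s_1yr = 0.80                       # assumed survival at end of trial
hazard = -np.log(s_1yr)            # implied annual hazard (t = 1 year)

# Project survival over a 25-year horizon and sum discounted life-years.
years = np.arange(1, 26)
survival = np.exp(-hazard * years)
discount = 1.035 ** -years         # 3.5% annual discount rate (assumed)
print(f"Discounted life-years over 25 years: {survival @ discount:.2f}")
```

The point of the sketch is how far one observed data point is stretched: 24 of the 25 modelled years rest entirely on the constant-hazard assumption.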

  17. The Norwegian Blue problem • Basic design: • Choice of comparator for economic evaluation • RCT data that convinces SMC clinicians • Generalisable measure of health benefit • Can clever stats ever conceal a big design flaw? • Stirling or Kilmarnock?

  18. Thumbnail of submission • “Markov model as basis for CUA”, time horizon for model, cycle length • Justified? • Diagram, data inputs – table and refs. • Relevance to Scotland 2004? • Is the model valid? (section missing???) • Key results and ICER • Can we really see what’s going on?
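
For the “Markov model as basis for CUA” line above, here is a minimal sketch of such a model; the three states, transition probabilities, costs and utilities are all hypothetical placeholders, not drawn from any real submission.

```python
import numpy as np

# Hypothetical 3-state Markov cohort model (stable / progressed / dead)
# with a 1-year cycle; every input below is invented for illustration.
P_new = np.array([[0.85, 0.10, 0.05],   # transition matrix, new medicine
                  [0.00, 0.80, 0.20],
                  [0.00, 0.00, 1.00]])
P_cmp = np.array([[0.75, 0.15, 0.10],   # transition matrix, comparator
                  [0.00, 0.80, 0.20],
                  [0.00, 0.00, 1.00]])
cost = np.array([12000.0, 8000.0, 0.0])  # annual cost per state (assumed)
util = np.array([0.85, 0.55, 0.0])       # utility per state (assumed)

def run(P, horizon=25, disc=0.035):
    state = np.array([1.0, 0.0, 0.0])    # whole cohort starts 'stable'
    total_cost = total_qaly = 0.0
    for t in range(horizon):
        df = (1 + disc) ** -t            # discount factor for cycle t
        total_cost += df * state @ cost
        total_qaly += df * state @ util
        state = state @ P                # advance the cohort one cycle
    return total_cost, total_qaly

c_new, q_new = run(P_new)
c_cmp, q_cmp = run(P_cmp)
icer = (c_new - c_cmp) / (q_new - q_cmp)
print(f"ICER: £{icer:,.0f} per QALY")
```

Even in this toy version, “can we really see what’s going on?” comes down to whether the table of inputs and the diagram let a reviewer reproduce numbers like these.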

  19. Key issues for modelling • Data inputs (but not for today) • How can we make the predictions of the model as transparent as possible? • How do we know if the model is valid or not? • What is it reasonable to expect a manufacturer to have available at launch?

  20. Transparency

  21. What is validity? • Mount Hood challenge as the ideal • Reality: model constructed by one economist, with more-or-less plausible data (including RCTs with one year follow-up), predicting “safe” cost per QALY over 25 years • The ideal is not feasible … • … but what is?
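
Short of a full Mount Hood-style model comparison, one feasible (assumed here, not prescribed by SMC) check is internal validation: replay the model over the trial period only and compare its prediction with the observed RCT figure before trusting the 25-year output. A sketch with invented numbers:

```python
import numpy as np

# Internal-validity check: run the model for the trial's follow-up only
# and compare with the trial result (both figures invented).
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])
state = np.array([1.0, 0.0, 0.0]) @ P   # one 1-year cycle
predicted_alive = state[:2].sum()       # model's 1-year survival
observed_alive = 0.94                   # hypothetical 1-year RCT figure
print(f"Model {predicted_alive:.2f} vs trial {observed_alive:.2f} "
      f"(diff {predicted_alive - observed_alive:+.2f})")
```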

  22. Calling John Hutton … • John will address this issue in a minute, but we have had submissions which: • Ignored trial data, just used focus group to estimate likely effects • RCT data collection was poor, so started again with a model • Took one key result and applied it in a pre-existing model

  23. Sensitivity analysis • About a third of econ submissions now include probabilistic sensitivity analysis (PSA); the visual aspect is very helpful • Issues: • If it is possible to “spin” a PSA, how can we critique it to spot this? • How do we interpret the results? • Should PSA be a requirement or does it have a more limited role? Or are there other options?
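
As context for critiquing a PSA, here is a minimal sketch of the mechanics with invented distributions: each uncertain parameter is drawn from a distribution, the (here, trivial) model is re-run per draw, and the draws are summarised. All parameter names and values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000                                  # number of Monte Carlo draws

# Invented parameter distributions (following the usual conventions:
# beta for probabilities, gamma for costs and positive quantities).
p_resp_new = rng.beta(40, 10, n)          # response prob., new medicine
p_resp_cmp = rng.beta(30, 20, n)          # response prob., comparator
cost_new   = rng.gamma(100.0, 150.0, n)   # cost, new (mean ~ £15,000)
cost_cmp   = rng.gamma(100.0, 50.0, n)    # cost, comparator (~ £5,000)
qaly_resp  = rng.gamma(16.0, 0.25, n)     # QALYs per responder (mean ~4)

# Re-run the (trivial) model once per draw.
inc_cost = cost_new - cost_cmp
inc_qaly = (p_resp_new - p_resp_cmp) * qaly_resp

# Summarise via net monetary benefit rather than per-draw ICERs, which
# blow up when incremental QALYs are near zero.
lam = 30000                               # threshold ICER (assumed)
nmb = lam * inc_qaly - inc_cost
print(f"P(cost-effective at £30,000/QALY): {(nmb > 0).mean():.0%}")
```

Every line marked “assumed” is a choice the manufacturer makes, which is exactly where the scope for “spin” sits.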

  24. Calling Andy and Adrian … What evidence should the manufacturer submit to support: • choice of range for a variable? • form of distribution? • number of simulations? • choice of ICER for “threshold”? • interpretation of CEAC (cost-effectiveness acceptability curve), NHB (net health benefit), etc.?
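
On the last two bullets, a sketch of how a CEAC and NHB are read off the same PSA draws; the simulated incremental costs and QALYs below are invented stand-ins for real model output.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
# Invented PSA output: per-draw incremental cost and incremental QALYs.
inc_cost = rng.normal(10000, 3000, n)
inc_qaly = rng.normal(0.6, 0.3, n)

# CEAC: at each threshold lambda, the probability the new medicine is
# cost-effective, i.e. that net monetary benefit is positive.
for lam in (10000, 20000, 30000, 50000):
    p_ce = (lam * inc_qaly - inc_cost > 0).mean()
    # NHB re-expresses the same comparison in QALY units.
    nhb = (inc_qaly - inc_cost / lam).mean()
    print(f"£{lam:>6,}/QALY: P(cost-effective) = {p_ce:.0%}, "
          f"mean NHB = {nhb:+.2f} QALYs")
```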

  25. Summary • Process and guidance • What matters (and what doesn’t) • Quality is key, not trying to deal with all aspects of that today • Focus is on: • Models: transparency and validity • PSA: role and/or niche
