
Trials





Presentation Transcript


  1. Trials Adrian Boyle

  2. Objectives • Design • Measures of quality • How to analyse data from an RCT • How to appraise an RCT

  3. New terms • Explanatory vs pragmatic • Surrogate end point • Equipoise • Factorial • Cross-over • Intention to treat • Number needed to treat

  4. Equipoise There should be substantial uncertainty in the clinician’s mind about which treatment is better for the patient before the patient is enrolled in a trial

  5. Why randomise? • To avoid / reduce selection bias • To balance confounders across groups

  6. Limitations of trials • High internal validity, often at the expense of external validity • Efficacy rather than effectiveness • Irrelevant, narrow questions • Often a ‘shot in the dark’ • Drug companies often want to compare against placebo, not standard treatment • Expensive and time consuming

  7. More jargon • Phase 1: clinical pharmacology; drug safety in volunteers • Phase 2: initial investigation of effect (effectiveness) • Phase 3: full-scale evaluation, compared to placebo or standard practice • Phase 4: post-marketing surveillance

  8. Trial design • Efficacy: the frontier of effect; whether the intervention works under ideal circumstances • Effectiveness: how the intervention works in ‘the real world’ • Explanatory: provides clues as to how the intervention works • Pragmatic: shows how well the intervention works

  9. Basic trial design Population → Randomisation → Exposure 1 / Exposure 2 → Outcome measured in each group
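
  As an illustration of this basic design (not part of the slides), here is a minimal Python sketch of simple randomisation into two exposure groups; the function name, participant IDs and seed are hypothetical.

    import random

    def randomise(participants, seed=None):
        """Simple randomisation: allocate each participant to one of two arms by a coin flip."""
        rng = random.Random(seed)
        exposure_1, exposure_2 = [], []
        for p in participants:
            (exposure_1 if rng.random() < 0.5 else exposure_2).append(p)
        return exposure_1, exposure_2

    population = list(range(200))            # hypothetical population of 200 participant IDs
    arm_1, arm_2 = randomise(population, seed=42)
    print(len(arm_1), len(arm_2))            # roughly 100 vs 100; each arm is then followed up for the outcome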

  10. Analysis of basic study design • Relative risk: incidence of outcome in group 1 DIVIDED BY incidence of outcome in group 2 • Sounds dramatic and sexy

  11. Analysis of basic study design • Absolute risk reduction: incidence of outcome in group 1 MINUS incidence of outcome in group 2 • Less sexy and more useful. • Ask the next drug rep. • Enjoy

  12. Analysis of basic study design • Number needed to treat • Inverse of the absolute risk reduction
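
  The three measures above translate directly into code. A minimal sketch, assuming “incidence” means the proportion of each group with the outcome; the function names are mine, not the slides’:

    def relative_risk(incidence_1, incidence_2):
        """Incidence of the outcome in group 1 DIVIDED BY incidence in group 2."""
        return incidence_1 / incidence_2

    def absolute_risk_reduction(incidence_1, incidence_2):
        """Incidence of the outcome in group 1 MINUS incidence in group 2."""
        return incidence_1 - incidence_2

    def number_needed_to_treat(incidence_1, incidence_2):
        """Inverse of the absolute risk reduction."""
        return 1 / absolute_risk_reduction(incidence_1, incidence_2)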

  13. Example • A trial of drug A compared to drug B found that 20 out of the 50 people who received A were alive at one year, compared to 15/50 who received drug B • What is the relative risk? • What is the absolute risk reduction? • What is the NNT?

  14. Example • Relative risk • 20/50 = 0.4 • 15/50 = 0.3 • 0.4/0.3 = 1.33 • Or this could be expressed as a 33% increase in survival at one year

  15. NNT • Absolute risk reduction 0.4 − 0.3 = 0.1 • This could be expressed as a risk reduction of 10% • NNT 1/0.1 = 10 • That is, you need to treat 10 people to save one life in one year
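
  Plugging the example numbers into the sketch above reproduces the slides’ arithmetic:

    incidence_a = 20 / 50   # drug A: 20 of 50 alive at one year
    incidence_b = 15 / 50   # drug B: 15 of 50 alive at one year

    print(relative_risk(incidence_a, incidence_b))            # ~1.33, i.e. a 33% relative increase
    print(absolute_risk_reduction(incidence_a, incidence_b))  # ~0.1, i.e. 10 percentage points
    print(number_needed_to_treat(incidence_a, incidence_b))   # ~10 people treated per life saved at one year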

  16. Multivariate analysis • Adjust for potential confounders and bias • Usually with logistic regression, expressed as an odds ratio
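
  As an illustration only (the data, variable names and choice of statsmodels are assumptions, not from the slides), an adjusted analysis might fit a logistic regression of the outcome on treatment plus a confounder and exponentiate the coefficients to obtain odds ratios:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical trial data: treatment arm, a confounder (age) and a binary outcome
    df = pd.DataFrame({
        "treatment": [1, 1, 1, 0, 0, 0, 1, 0, 1, 0],
        "age":       [54, 61, 47, 58, 66, 49, 72, 55, 63, 60],
        "outcome":   [1, 0, 1, 0, 0, 1, 0, 0, 1, 0],
    })

    X = sm.add_constant(df[["treatment", "age"]])   # adjust for age as a potential confounder
    model = sm.Logit(df["outcome"], X).fit(disp=False)

    print(np.exp(model.params))                     # exponentiated coefficients are odds ratios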

  17. Factorial trial design Population → Randomisation → four arms (Exposure 1 + placebo, placebo + Exposure 2, Exposure 1 + Exposure 2, placebo + placebo) → Outcome measured in each arm

  18. Cross-over design Population → Randomisation → one group receives Exposure 1 then Exposure 2, the other Exposure 2 then Exposure 1, with the Outcome measured after each period

  19. Cluster • The unit of randomisation is a group of individuals, e.g. GP practices or hospitals • Easier implementation of a complex design • Large studies • Seriously complicated statistics

  20. Randomisation Many methods • Simple • Block • Stratification • Weighted • The proof of the pudding is in Table 1
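
  A minimal sketch of one of these methods, block randomisation (the function name and block size are hypothetical): within each block of, say, four, two participants go to each arm, so the group sizes stay balanced throughout recruitment.

    import random

    def block_randomise(n_participants, block_size=4, seed=None):
        """Block randomisation: shuffle equal numbers of 'A' and 'B' within each block."""
        rng = random.Random(seed)
        allocations = []
        while len(allocations) < n_participants:
            block = ["A"] * (block_size // 2) + ["B"] * (block_size // 2)
            rng.shuffle(block)
            allocations.extend(block)
        return allocations[:n_participants]

    print(block_randomise(10, block_size=4, seed=1))
    # e.g. ['A', 'B', 'B', 'A', ...] -- the arms never differ by more than half a block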

  21. Sources of bias in a trial • Selection • Performance • Losses to follow up • Detection

  22. What bias is there here? [Flow diagram: population of 200 randomised into two groups of 100; 15 shown leaving one arm before outcome assessment; outcomes 12 vs 10]

  23. Is this bias? [Flow diagram: population of 2000, of whom only 100 are randomised to each exposure; outcomes 50 vs 10]

  24. What bias is there here? [Flow diagram: population of 200 randomised into two groups of 100; one arm shows outcome 20 / no outcome 60, the other outcome 10 / no outcome 80]

  25. Outcome measure • Does the outcome mean anything to you? • Beware surrogate outcomes • Beware composite outcomes, especially if industry funded

  26. Appraising a trial • Identify the aims • Identify the study design • Identify the population, exposure and outcome • Consider the randomisation • Consider the blinding • Consider the measurement • What biases and confounding factors are there? • What is the result and what does it mean?

  27. Assessing strength / quality • JADAD scoring • http://www.naturalstandard.org/explanation_columns.html
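
  For orientation only, a rough sketch of the commonly cited 5-point Jadad items (randomisation, blinding, withdrawals); the exact wording and deduction rules should be checked against the original instrument.

    def jadad_score(randomised, randomisation_appropriate,
                    double_blind, blinding_appropriate,
                    withdrawals_described):
        """Rough sketch of the 5-point Jadad scale; pass True/False, or None where a method is not described."""
        score = 0
        if randomised:
            score += 1
            if randomisation_appropriate is True:
                score += 1
            elif randomisation_appropriate is False:
                score -= 1
        if double_blind:
            score += 1
            if blinding_appropriate is True:
                score += 1
            elif blinding_appropriate is False:
                score -= 1
        if withdrawals_described:
            score += 1
        return max(score, 0)

    print(jadad_score(True, True, True, True, True))  # 5: the maximum score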
