
Teaching EBM in homeopathic doses




Presentation Transcript


  1. Teaching EBM in homeopathic doses Amanda Burls Oxford, September 2005

  2. Critical Appraisal Skills Programme: making sense of evidence

  3. General origins: The need to be evidence-based • Wide variations in practice • Continued use of ineffective treatments • Excess use of inappropriate treatments • Poor uptake of effective practice • Increasing consumerism • Increasing demand on resources • Exponential growth in research evidence

  4. Local origins: • 1980s – Getting Research into Practice Project (GRiPP) • 1990 – Evaluation of barriers: lack of understanding of the need for evidence-based practice amongst managers and policy makers • 1992 – Critical appraisal workshops piloted • 1993 – Critical Appraisal Skills Programme • 1994 – Finding the Evidence workshops

  5. Typical CASP workshop format • Half-day workshops • Structure: introductory talk, small group work, plenary discussion • Problem-based, using real papers on, e.g., randomised controlled trials, systematic reviews, evaluations of diagnostic tests, economic evaluations, qualitative studies

  6. Philosophy • Problem based • Interactive/participative • Small group learning • Building on people's own knowledge and experience • Fun and safe • Learning to recognise, admit and cope with uncertainty • Multidisciplinary • Where possible, in people's workplaces • Cascading skills

  7. Traditional teaching environment

  8. Small group learning Using and building on people's own experience, skills and knowledge

  9. Negative learning experiences

  10. CASP learning experience Safe learning environment

  11. Fun Problem

  12. Cascading

  13. You can do it too!

  14. A helping hand while finding one's feet

  15. Teaching EBM in homeopathic doses Amanda Burls Oxford, September 2005

  16. Less is more

  17. Models of education 1 • Filling the empty vessel

  18. Models of education 2 Learning is a natural process Teaching is facilitation of growth

  19. Models of education 2 • Learning is a natural process • Teaching is facilitation of growth

  20. Why am I asking this question?

  21. Models of education • Learning is a natural process • Teaching is facilitation of growth

  22. The “dibber”

  23. The dibber

  24. What do you think health care practitioners need to know about ODDS RATIOS? • Minimum that is useful • Optimal • Please feel free to discuss with your neighbours
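
The odds-ratio question above can be grounded with a minimal sketch: computing an odds ratio and an approximate 95% confidence interval (Woolf's logit method) from a 2×2 table. The function name and the illustrative numbers (borrowed from the backache trial on a later slide) are assumptions for illustration, not from the talk.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with approximate 95% CI (Woolf/logit method)
    for a 2x2 table: a, b = events / non-events in one group,
    c, d = events / non-events in the other group."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Backache trial from a later slide: 4/5 improved on Potters, 2/5 on placebo
or_, lo, hi = odds_ratio_ci(4, 1, 2, 3)
print(f"OR = {or_:.1f}, 95% CI {lo:.2f} to {hi:.2f}")
```

With numbers this small the interval is enormous and spans 1, which itself makes a useful teaching point about precision.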

  25. Teaching p-values DIB: • Create the need for the concept of the p-value by getting participants to reject spontaneously a ridiculously small study

  26. RCT: Well conducted no bias • 5 people with backache received Potters • 5 people received placebo • 4 out of 5 with Potters got better • 2 out of 5 with placebo got better

  27. Participants are not convinced… "It could have happened by chance!" • So how many would you want before you believe the results? • 10 in each arm? • 20? • 100?
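
The "could it be chance?" intuition about the tiny trial can be made exact with a one-sided Fisher's exact test, computed here straight from the hypergeometric distribution with only the standard library; `fisher_one_sided` is a hypothetical helper written for this sketch.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact p-value for a 2x2 table:
    probability of at least `a` successes in the first group
    if the treatment did nothing (all margins held fixed)."""
    n = a + b + c + d          # total participants
    m = a + c                  # total who got better
    n1 = a + b                 # size of the treatment arm
    denom = comb(n, n1)
    p = 0.0
    for k in range(a, min(n1, m) + 1):
        p += comb(m, k) * comb(n - m, n1 - k) / denom
    return p

# 4/5 better on Potters vs 2/5 on placebo
p = fisher_one_sided(4, 1, 2, 3)
print(f"p = {p:.2f}")   # → prints "p = 0.26"
```

A result at least this extreme arises by chance roughly one time in four, so the participants' scepticism is exactly right.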

  28. "It could have happened by chance and nothing was really going on" Uumm… the "Null Hypothesis"

  29. The p-value • What does a p-value of 0.05 tell us? • Use with great care…

  30. [Probability scale from 0 ("impossible") to 1 ("absolutely certain")] So what does p = 0.5 mean? So what does p = 0.1 mean? So what does p = 0.05 mean?

  31. Before I show the homeopathic dose of confidence intervals, let's explore your views…

  32. [Forest plot: hypothermia vs. control in severe head injury; outcome: mortality or incapacity (n=158); studies: Clifton 1992, Clifton 1993, Hirayama 1994, Marion 1997; pooled RR 0.63 (95% CI 0.46 to 0.87); RR scale 0.1–10]

  33. [Same forest plot, now labelled: left of RR = 1 favours intervention, right of RR = 1 favours control]

  34. Confidence intervals: which definition is better? • "Confidence intervals are based on the assumption that a study provides one sample of observations out of many possible samples that would be derived if the study were repeated many times. For a 95% confidence interval, if the experiment were repeated many times, 95% of the intervals would contain the true treatment effect." • "The confidence interval (CI) is the range within which the true size of effect (never known exactly) lies, with a given degree of assurance. People often speak of a '95% confidence interval' (or '95% confidence limits'); this is the interval which includes the true value with 95% certainty."
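
The repeated-sampling definition can be demonstrated rather than asserted: simulate many "studies", build a normal-approximation 95% CI for a proportion in each, and count how often the interval covers the true value. The parameters below are arbitrary choices for illustration.

```python
import math
import random

random.seed(42)

TRUE_P = 0.3        # the true proportion (unknowable in real life)
N = 200             # sample size per simulated study
REPEATS = 10_000    # number of repeated "studies"

covered = 0
for _ in range(REPEATS):
    successes = sum(random.random() < TRUE_P for _ in range(N))
    p_hat = successes / N
    se = math.sqrt(p_hat * (1 - p_hat) / N)      # normal approximation
    lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se
    covered += lo <= TRUE_P <= hi

print(f"{covered / REPEATS:.1%} of the 95% CIs contain the true value")
```

The observed coverage comes out close to 95%, which is precisely what the first definition on the slide claims.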

  35. [Forest plot shown again: hypothermia vs. control in severe head injury; mortality or incapacity (n=158); pooled RR 0.63 (95% CI 0.46 to 0.87); left of RR = 1 favours intervention, right favours control]

  36. Moral: Any observed difference between two groups, no matter how small, can be made to be "statistically significant" - at any level of significance - by taking a sufficiently large sample.
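
This moral can be checked numerically: for a fixed, trivial difference in proportions, the two-sample z statistic grows like √n, so the p-value eventually falls below any fixed threshold. A sketch using the pooled normal approximation; the function name is an invention for illustration.

```python
import math

def z_two_proportions(p1, p2, n):
    """z statistic for comparing two proportions with n per arm,
    using the pooled-variance normal approximation."""
    p = (p1 + p2) / 2
    se = math.sqrt(p * (1 - p) * (2 / n))
    return (p1 - p2) / se

# A trivial 1-percentage-point difference (51% vs 50%)...
for n in (100, 1_000, 10_000, 100_000):
    z = z_two_proportions(0.51, 0.50, n)
    print(f"n = {n:>7,} per arm: z = {z:.2f}")
# ...crosses z = 1.96 (p < 0.05) at around n ≈ 19,000 per arm
```

Statistical significance at a fixed threshold therefore says nothing by itself about whether a difference is large enough to matter clinically.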

  37. Teaching confidence intervals The Modified Franks Method (with José's help)

  38. Confidence intervals
