
Introduction to ‘EBSI’




Presentation Transcript


  1. Introduction to ‘EBSI’ Methodologies for a new era summer school School of Applied Social Studies, University College Cork 20 June 2011 Dr Paul Montgomery

  2. Why Evidence-Based Social Intervention? • Why practice needs a sound evidence base • ethical imperative to do more good than harm • best use of limited resources • wide variation in practice

  3. What is EBP? • “Evidence-Based Practice” is a popular term, but what does it mean? Good-quality evidence? • Brief history • Challenges for EBP (critiques of EBP)

  4. Why is it important to base practice on good-quality research evidence? If we want to intervene (interfere) in people’s lives, and spend large sums of money doing so, then we have an ethical duty to show that we are basing our interventions on the best available evidence. If not, we may at best be wasting the precious time, money and hopes of vulnerable clients; at worst, we may be doing more harm than good.

  5. Evidence-Based Practice: a definition. “The conscientious, explicit and judicious use of best currently available evidence, integrated with client values and professional expertise, in making decisions about the care of individuals” - Can also apply to planning of services - (adapted from Sackett et al., 2000)

  6. EBP Model (Haynes, Devereaux, and Guyatt, 2002): research evidence, clinical expertise, and client preferences and actions, all considered within the clinical state and circumstances

  7. Elements of definition of EBP • Conscientious: ethical, effective, honest • Explicit: transparent re evidence / other reasons for decisions, with client • Judicious: considered, prudent • Best, currently available evidence: rigorous as possible, subject to updating • Client values a key part • Contrasts with authority-based knowledge

  8. Features of EBP: ‘Client values’ • Client is part of decisions: their preferences, experiences, values etc. are integrated with evidence and expertise • Share evidence with the client, otherwise informed consent is meaningless • Need honesty and openness about the state of knowledge • Empowering if this is done • These principles are applicable at a community level

  9. Features of EBP: Anti-authoritarian • Not ‘I know best’; lifelong learner, questioner, always updating • Client as part of the decision-making team • Sharing knowledge and expertise • Based on respect for the client and their knowledge

  10. Ethics • Ethical to do good and avoid harm by using the best evidence • Ethical to involve the client; fully informed consent requires open and up-to-date information about effectiveness • Many ethical codes require this • Much concern re. ‘conflicts of interest’ among researchers and practitioners

  11. Why is EBP possible now? (Gambrill, 2004) • Recognition of scarce resources: need for ‘good value’ from public services; transparency, accountability. EBM well established • Pressure/activism from consumers and the public; notions of human rights • Increased attention to harm, mistakes, whistle-blowing, etc. • Internet/information revolution: databases, searching, e-publishing, accessibility • Advances in research methods: systematic reviewing, epidemiology, trial methodology

  12. The ‘5 Core Steps of EBP’ 1. Formulating answerable questions 2. Searching literature 3. Critical appraisal of research 4. Applying findings to practice 5. Evaluation of outcome

  13. Qualitative and related work • These primary issues develop from detailed (largely) qualitative work • Mechanisms and process issues are similarly explored in these ways • Qualitative work should generally run in tandem with the quantitative work presented here

  14. From basic research questions to evidence-based practice: Nature & prevalence of problem (who is it a problem for?) → causal models (risk/protective factors) → intervention trials: RCTs (‘efficacy’) → intervention trials: RCTs (‘effectiveness’) → systematic reviews & meta-analyses → practice guidelines → evidence-based practice (judicious application of research to client / organisation)

  15. Randomised controlled trial (RCT) • ‘Gold standard’ research design for evaluating interventions; attempts to minimise sources of bias • Allocates participants at random to intervention and comparison groups (this is the defining feature) • Uses the same meaningful, reliable assessments before and after the intervention • Double or single ‘blind’ if possible: reduces a very important source of bias
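The defining feature of an RCT, random allocation, can be sketched in a few lines. This is a minimal illustration only: the participant IDs and group sizes are made up, and real trials use more careful allocation procedures (e.g. concealed, stratified allocation).

```python
import random

def randomise(participants, seed=None):
    """Shuffle participants and split them evenly into two trial arms."""
    rng = random.Random(seed)  # seeded so the allocation is reproducible
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

# Illustrative run with 20 hypothetical participant IDs.
arms = randomise(list(range(1, 21)), seed=42)
print(len(arms["intervention"]), len(arms["control"]))  # 10 10
```

Because allocation is determined by chance alone, known and unknown confounders tend to balance across the two groups as sample size grows, which is what makes the design the ‘gold standard’.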

  16. Systematic review An overview or summary of primary studies, carried out according to an explicit set of aims & methods, so the review is reproducible, e.g. a set of RCTs all addressing a similar question, or a set of studies about prevalence, causes or screening. • May include meta-analysis: quantitative summation of results combined from several similar studies • The Cochrane Collaboration (& Campbell) publishes thousands, for intervention questions, on the web
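The quantitative summation a meta-analysis performs can be illustrated with the simplest pooling method, an inverse-variance fixed-effect average: each study's effect size is weighted by the precision of its estimate. The effect sizes and standard errors below are invented for illustration, not taken from any real review.

```python
def pooled_effect(effects, std_errors):
    """Fixed-effect inverse-variance pooling of study effect sizes."""
    # Each study's weight is 1 / SE^2: more precise studies count for more.
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Three hypothetical trials of the same intervention.
effect, se = pooled_effect([0.30, 0.10, 0.25], [0.10, 0.15, 0.12])
print(round(effect, 3), round(se, 3))
```

The pooled standard error is smaller than any single study's, which is the statistical point of combining studies: a more precise overall estimate than any one trial can give. Real reviews (e.g. Cochrane's) also check between-study heterogeneity before trusting a fixed-effect summary.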

  17. An early, pioneering Randomised Controlled Trial (RCT): the Cambridge-Somerville study Cabot carried out the first major RCT in social work in 1930s Massachusetts, USA (Powers & Witmer, 1951; McCord, 2001) - theory-driven intervention, based on knowledge of risk factors for crime

  18. Cambridge-Somerville study: design • 650 boys under 12 (mean age 10) recruited in 1935; 506 remained after WW2 • At risk of delinquency due to poor, high-crime areas • Placed in matched pairs (similar age, SES etc.) • One of each pair randomly assigned to intervention, the other to the control group • Intervention lasted 5.5 years on average • Follow-up: mid-1940s; late 1970s (age 47, 98% traced) • Outcomes: records of courts, deaths, mental illness; careful records of contacts and interventions kept

  19. Cambridge-Somerville study: results • 6-10 years later: no differences found between groups in behaviour or delinquency rates • 35-year follow-up (age 47, 98% of the sample traced using state records): intervention boys were more likely to have negative outcomes, including serious convictions, death by age 35 and serious mental illness, compared to the control group • Note that two different methods give the same message

  20. Other programmes that harm? • See the McCord (2003) paper on the web • Systematic review of ‘Scared Straight’: these programmes gave youngsters a taste of what prison was like and were adopted in 38 states. Petrosino et al. (2002), Campbell/Cochrane library

  21. Common interventions that do no good / modest evidence of harm • Rose et al., Cochrane review of brief crisis intervention following exposure to traumatic events (“debriefing”) • With youth problem behaviour: not effective if based on scare tactics or toughness (boot camps), lecturing (DARE), aggregating high-risk youth (a feature of many services), or 1-1 non-directive mentoring

  22. What can we learn from studying these? 1. What sorts of interventions appear more likely to harm, or to do no good? 2. What are the mechanisms of harm? Or: what is actually going wrong in this intervention? NB We also want to know this for interventions that go well: what are the active, useful ingredients (mediators of intervention)?

  23. Factors that may make an intervention more likely to harm / do no good (slides from Tom Dishion, Oregon, 2004) • The intervention target is not derived from an empirically derived model or theory (e.g., ‘Scared Straight’ or DARE, Drug Abuse Resistance Education) • The intervention protocol (target, strategy and context) is not clearly articulated • The intervention staff are not well trained/supervised with respect to implementation fidelity, or not held accountable for outcomes

  24. Critiques of EBP Limitations apparently based on misconstrual (‘straw man’): • EBP only uses one method; a cook-book approach; dictates to professionals; you can’t do RCTs in complex situations (etc.) Social science/intervention is different from medicine: • Human experience can’t be quantified; other kinds of evidence are just as valid; interventions and contexts are too complex for RCTs Practical arguments: • Not feasible for practitioners (time, resources, expertise) • Not enough evidence. Does EBP work? Does it lead to better outcomes for people?

  25. EBP Model (Haynes, Devereaux, and Guyatt, 2002): research evidence, clinical expertise, and client preferences and actions, all considered within the clinical state and circumstances

  26. Thank you • Paul.montgomery@spi.ox.ac.uk
