Resources for Evidence-Based School Social Work Practice. Stephanie Baus, Tulane University
Workshop Outline • Overview of Evidence-Based Practice • EBP definition and model • EBP process (steps) • Range of practice questions • Resources for learning EBP skills • EBP in School Social Work • Relationship to RTI and PBS • Assessment • Interventions • Your practice questions
Workshop Outline (cont.) • Formulating practice questions • Effectiveness/prevention • Risk/Assessment • Descriptive • Resources for locating evidence • Systematic reviews of studies • Electronic databases • Search strategies • Specialized sites • Scholarly books
Workshop Outline (cont.) • Evidence-informed Assessment in SSW • Assessment questions • Locating evidence (searching, resources) • Critical appraisal • Application • Evidence-Informed Intervention in SSW • Effectiveness/prevention questions • Locating evidence (searching, resources) • Critical appraisal • Application • Iatrogenic interventions
Workshop Outline (cont.) • Evaluating Outcomes • Group designs • Single subject designs
Definition of Evidence-Based Practice “Placing the client’s benefits first, evidence-based practitioners adopt a process of lifelong learning that involves continually posing specific questions of direct practical importance to clients, searching objectively and efficiently for the current best evidence relative to each question, and taking appropriate action guided by evidence” (Gibbs, 2003, p. 6).
EBP Model (Haynes, Devereaux, & Guyatt, 2002) • Research evidence • Clinical state and circumstances • Client preferences and actions • Practitioner's expertise (integrating the other three)
Steps of EBP* • Step 1: Convert an information need into an answerable practice question. • Step 2: Efficiently locate the best evidence to answer the question. • Step 3: Critically appraise the evidence for its validity and usefulness.
Steps of EBP (cont.)* • Step 4: Using practice expertise to integrate evidence with student characteristics and school context, apply the results of the evidence appraisal to practice. • Step 5: Evaluate the outcome of evidence-based action. • Step 6: Teach others: challenges and obstacles *Based on Sackett et al., 1997, and Gibbs, 2003
Range of Practice Questions • Effectiveness/Prevention • Assessment/Risk • Descriptive
Terminology • Evidence-Based Practice (EBP) • Evidence-Informed Practice • Empirically-Supported Interventions/Treatments
Resources for Learning EBP Skills • Gibbs, L. E. (2003). Evidence-based practice for the helping professions. Pacific Grove, CA: Brooks/Cole-Thomson Learning. • Kelly, M. S., Raines, J. C., Stone, S., & Frey, A. (2010). School social work: An evidence-informed framework for practice. New York: Oxford University Press. • Raines, J. C. (2008). Evidence-based practice in school mental health. New York: Oxford University Press. • Rubin, A. (2008). Practitioner’s guide to using research for evidence-based practice. Hoboken, NJ: John Wiley & Sons.
EBP in Response to Intervention and Positive Behavior Support Models • Efficient and Reliable Assessment Tools/Procedures • Screening to identify prevalence of problems • Diagnostic evaluation of individual students • Monitoring of treatment fidelity and student progress • Empirically-Supported Interventions • Primary Tier: core curriculum to all students/ preventive • Secondary Tier: additional support to at-risk students • Tertiary Tier: intensive support to individual students
Evidence-Informed School Social Work Practice • A recent survey of over 1,600 school social workers indicates that • Few use online databases, journals, or scholarly books to inform practice. • The primary focus is on interventions targeting individual change and risk factors rather than primary prevention. (Kelly et al., 2008) • Your practice questions?
COPES Questions • Client-Oriented • Practical • Evidence-Search Gibbs, L. E. (2003). Evidence-based practice for the helping professions: A practical guide with integrated multimedia. Pacific Grove, CA: Brooks/Cole.
COPES Questions (Client-Oriented Practical Evidence-Search) • Four Features of a Well-Built Question
Formulating Effectiveness/Prevention Questions: a) Comparing an intervention to no intervention b) Comparing two interventions c) What is the best intervention?
Formulating Assessment/Risk Questions: a) Comparing an assessment instrument/procedure to no assessment b) Comparing two assessment instruments/procedures c) What is the best assessment instrument/procedure?
Formulating Descriptive (Quantitative) Questions: a) Summarizing characteristics (how much, how many, what percentage, what is the average?) b) Asking about relationships between variables
Formulating Descriptive (Qualitative) Questions: a) Require narrative responses b) Suggest in-depth explorations, little known phenomena (how would they describe, what is the process, how do they experience?)
Practice Exercise • Describe a practice situation: • Ask a practice question: • Identify question type: • Formulate a COPES question: Client/problem • Action • Alternative • Outcome
Systematic Reviews and Meta-analyses • Campbell Collaboration (Social Work, Education & Criminal Justice) http://www.campbellcollaboration.org/ • National Registry of Evidence-Based Programs and Practices, SAMHSA http://www.nrepp.samhsa.gov/index.htm • What Works Clearinghouse http://www.ies.ed.gov/ncee/wwc/
On-line Databases • Subscription databases: Cochrane Collaboration, SWAB, SSAB, PsycInfo, Criminal Justice Ab., Sociological Ab., Medline, CINAHL, etc. • Free access databases: Campbell Collaboration, ERIC, PubMed, Google Scholar, PILOTS • Government and professional sites: National Center for PTSD at http://www.ptsd.va.gov/, Information for Practice at http://ifp.nyu.edu/
Search Planning Worksheet • COPES question • Synonyms • Thesaurus • Search terms • Boolean logic • AND, OR, NOT • “wildcards” (truncation)
MOLES: Methodology-Oriented Locators for Evidence Searches (Gibbs, 2003)
Search Summary #1: meta-anal* or metaanal* or meta anal* or systematic review* or synthesis of studies or study synthesis (6,807) #2: delinquency prevention (250) #3: #1 AND #2 (5)
[Venn diagram: MOLES terms (6,807 hits) AND delinquency prevention (250 hits) → 5 hits in the overlap]
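The Boolean logic behind the search summary above is set logic: AND keeps only records matching both term groups, OR merges them, NOT subtracts. A minimal sketch with made-up record IDs (real databases return citations, and the hit counts above are the database's, not these):

```python
# Boolean search operators illustrated as set operations.
# Record IDs below are invented for illustration only.
moles_hits = {1, 2, 3, 4, 5, 6, 7}    # records matching the MOLES method terms
topic_hits = {5, 6, 7, 8, 9}          # records matching "delinquency prevention"

and_results = moles_hits & topic_hits  # AND: records in BOTH sets
or_results = moles_hits | topic_hits   # OR: records in EITHER set
not_results = topic_hits - moles_hits  # NOT: topic records lacking MOLES terms

print(len(and_results))  # 3
print(len(or_results))   # 9
print(len(not_results))  # 2
```

This is why adding an AND term always shrinks (or holds) the hit count, while adding an OR synonym always grows (or holds) it — the logic behind the "too few/too many hits" revision strategies.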
Descriptive (quantitative) MOLES “random* select*” OR “random* sampl*” OR survey OR questionnaire* OR “representative sample” OR “national sample” OR “interview schedule” OR correlat* (Baus)
Revising the Search • Keep a search history (databases, terms, hits, relevance) • Too few hits: Broaden search, include all synonyms and combinations, use controlled language, check spelling, check Boolean structure, change databases. (Removing MOLES is last resort.) • Too many hits: Narrow search, more specific terms, add categories, check Boolean structure. • Hits not relevant: Consider multiple meanings and edit terms, use controlled language, use “not.”
Evidence-Informed Assessment • Assessment questions • Resources • Kelley, M. L., Noell, G., & Reitman, D. (2003). Practitioner’s guide to empirically based measures of school behavior. New York: Kluwer Academic Publishers. • Fischer, J., & Corcoran, K. (2007). Measures for clinical practice and research: A sourcebook (4th ed.). New York: Oxford University Press. • www.ets.org/test_link/find_tests/ • Searching
Criteria for Evaluating Effectiveness Studies • Relevance (importance to serving clients, common to practice, feasible, may lead to change) • Evidence quality (control group, randomization, attrition, subjects and raters “blind,” equivalent groups) • Statistical significance (p < .05 - unlikely due to chance) • Impact of intervention (effect size, absolute risk reduction, number needed to treat, number needed to harm) (Armstrong, cited in Gibbs, 2003)
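The impact statistics named above can be computed directly. A worked sketch with wholly hypothetical numbers (the event rates, means, and SD below are invented for illustration, not from any study):

```python
import math

# Hypothetical rates: 40% of control students and 25% of treated students
# still show the problem behavior after the intervention period.
control_event_rate = 0.40
treatment_event_rate = 0.25

# Absolute risk reduction (ARR): difference in event rates between groups.
arr = control_event_rate - treatment_event_rate

# Number needed to treat (NNT): students who must receive the intervention
# for one additional student to benefit (rounded up to a whole student).
nnt = math.ceil(1 / arr)

# Cohen's d effect size from hypothetical group means and a pooled SD.
treatment_mean, control_mean, pooled_sd = 22.0, 18.0, 5.0
cohens_d = (treatment_mean - control_mean) / pooled_sd

print(f"ARR = {arr:.2f}, NNT = {nnt}, d = {cohens_d:.1f}")
```

With these invented numbers, ARR is 0.15 (a 15-percentage-point reduction), so about 7 students must be treated per additional student helped, and d = 0.8 falls at Cohen's conventional "large" threshold.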
Meta-Analyses and Systematic Reviews • Differences between meta-analyses, systematic reviews and narrative reviews • Locating meta-analyses and systematic reviews • Critically appraising meta-analyses and systematic reviews
Criteria for Evaluating Descriptive Studies • Relevance (provides important information about client needs, characteristics, and perceptions) • Evidence quality (random selection or representative sample, sample size, clearly written pretested questions, response rate, appropriate generalization) • Statistical significance for relationships (p < .05 - unlikely due to chance) • Strength of relationships/effect size
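For the "strength of relationships" criterion, Pearson's r is the usual index for two quantitative variables. A self-contained sketch with invented data (the example question and numbers are hypothetical):

```python
# Pearson's r for a descriptive relationship question, e.g.
# "is attendance related to grade point average?" Data are invented.
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient, ranging from -1 to 1."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den

attendance_pct = [95, 80, 60, 90, 70, 85, 55, 98]
gpa = [3.8, 3.0, 2.1, 3.5, 2.6, 3.2, 2.0, 3.9]

r = pearson_r(attendance_pct, gpa)
print(f"r = {r:.2f}")
```

Statistical significance (p < .05) tells you the relationship is unlikely to be chance; r itself tells you how strong it is, which is the separate question the slide's last bullet asks.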
Criteria for Evaluating Assessment/Risk Studies • Relevance (importance to clients, common to practice, feasible, may lead to change) • Ease of use (administration, scoring) • Evidence quality • Reliability: interrater agreement and/or Cronbach’s alpha > .70 • Validity: comparison with gold standard > .70, positive predictive value, negative predictive value (Armstrong, cited in Gibbs, 2003)
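The Cronbach's alpha benchmark above (> .70) comes from a simple formula: alpha = (k/(k-1)) x (1 - sum of item variances / variance of total scores). A sketch with a fabricated rating matrix (rows = students, columns = items on a hypothetical behavior scale):

```python
# Cronbach's alpha for internal-consistency reliability.
# The score matrix is invented purely to illustrate the computation.
from statistics import pvariance

scores = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
]

k = len(scores[0])                     # number of items on the scale
items = list(zip(*scores))             # per-item columns of scores
totals = [sum(row) for row in scores]  # per-student total scores

item_var_sum = sum(pvariance(col) for col in items)
alpha = (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))
print(f"alpha = {alpha:.2f}")
```

Because these invented items track each other closely, alpha comes out well above the .70 rule of thumb; items that disagree with one another drive alpha down.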
Criteria for Evaluating Qualitative Studies • Relevance (provides important understanding of client experiences, perspectives, processes) • Evidence quality • appropriate, clearly described methods • theoretical consistency • procedures to enhance trustworthiness* (credibility, dependability, confirmability, transferability) • systematic analysis • no unfounded generalization *Lincoln & Guba, 1985
Use of Checklists/Rating Forms • Rate study quality • Applying explicit criteria increases reliability • Use a summary score or not?