SWL 579A Session 5: Methodological Challenges in Prevention Science

Presentation Transcript


  1. SWL 579A Session 5 Methodological challenges in prevention science Guest Lecturer: Eric Brown, Ph.D. School of Social Work University of Washington 10/28/09

  2. Intervention Research

  3. Prevention programs may originate in practice (e.g., Homebuilders, Fountain House), from intervention researchers (e.g., MC, YM), or from a combination of the two (e.g., CTC). Intervention research is usually conducted by the program designer but sometimes involves an independent evaluator (e.g., Mathematica Policy Research).

  4. Challenges in Intervention Research

  5. Effects of SSDP Intervention on School Bonding from Age 13 to 18 (Hawkins, Guo, Hill, Battin-Pearson, & Abbott, 2001)

  6. Methodological challenges in prevention science • Unit of analysis. • Measurement issues. • Heterogeneity of effects in different subgroups. • Assessing intervention effects on developmental change. • Attrition and missing data.

  7. 3 Design Stages • Pre-Intervention Assignment Design • Intervention Design • Post-Intervention Design

  8. Intervention Assignment • Randomization • Balance, Matching, Blocking • Cluster Random Assignment

  9. What are the Fatal Design Flaws in a Trial? • Pre-Intervention Assignment: • Extreme Selection Bias • Not a Large Enough Sample is Drawn • Intervention: • Contamination/Leakage • Participation Bias • Implementation • Participation • Adherence • Dosage • Drop-out during intervention period

  10. Intervention Design • Intervention and control subjects are different • Contamination • Randomized at the wrong level • Low intervention delivery • High drop-out

  11. Post-Intervention Design • Large attrition • Differential attrition • Differential measurement error

  12. When it’s not possible to randomize or when randomization fails… • Propensity score = Probability of receiving intervention given observed covariates. • Often estimated using logistic regression. • Discriminates between experimental and control groups. • Can be thought of as a “balancing score” • Within groups with similar propensity scores, distribution of covariates will be similar across experimental and control groups. • Allows post hoc matching based on propensity score instead of all covariates directly.

  13. Propensity Scores (continued) • Can also be used when follow-up of all participants is not possible. • Reduces non-intervention related differences between experimental and control groups. • Gives better estimates of intervention effects (reduced bias).
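As a rough sketch of how this is often done in practice (not the specific procedure used in any study cited here), the propensity score can be estimated with logistic regression and then used for nearest-neighbor matching instead of matching on all covariates directly. All file and column names below are hypothetical, and covariates are assumed to be numeric or dummy-coded:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical analysis file: one row per participant, a 0/1 'treated' flag,
# and a few observed covariates (all names are made up for illustration).
df = pd.read_csv("participants.csv")
covariates = ["age", "sex", "baseline_risk"]

# Step 1: estimate the propensity score with logistic regression.
logit = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
df["pscore"] = logit.predict_proba(df[covariates])[:, 1]

# Step 2: greedy 1:1 nearest-neighbor matching on the propensity score,
# without replacement, as a stand-in for matching on every covariate.
treated = df[df["treated"] == 1]
controls = df[df["treated"] == 0].copy()
matched_pairs = []
for idx, row in treated.iterrows():
    best = (controls["pscore"] - row["pscore"]).abs().idxmin()
    matched_pairs.append((idx, best))
    controls = controls.drop(best)
```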

  14. Statistical Power Hypotheses: H0: μ1 = μ2, HA: μ1 ≠ μ2. Type I error rate (α): probability of rejecting H0 when TRUE. Type II error rate (β): probability of accepting H0 when FALSE. Examples: H0: Unsafe to cross the street, HA: Safe to cross the street; H0: The defendant is innocent, HA: The defendant is guilty. Power = 1 – β: probability of rejecting H0 when FALSE.
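Power depends on the effect size, sample size, and α. A minimal sketch using statsmodels' TTestIndPower; the effect size, α, and per-group n are illustrative values only, not figures from the lecture:

```python
from statsmodels.stats.power import TTestIndPower

# Power for a two-group comparison with illustrative inputs.
analysis = TTestIndPower()
power = analysis.power(effect_size=0.3, nobs1=100, alpha=0.05, ratio=1.0)
print(f"Power (1 - beta) = {power:.2f}")

# Solving the other way: n per group needed to reach 80% power.
n_needed = analysis.solve_power(effect_size=0.3, power=0.80, alpha=0.05)
print(f"n per group needed: {n_needed:.0f}")
```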

  15. Strategies to Increase Power • Increase sample size • Balance, then randomize • Use multiple outcome measures • Draw more homogeneous sample • Analyze at multiple (appropriate) levels

  16. Example: The Community Youth Development Study (CYDS) A randomized controlled trial to test the effectiveness of the Communities That Care prevention operating system.

  17. Communities That Care To foster healthy youth development in communities: • Reduce levels of risk • Increase levels of promotion and protection • Reduce levels of youth substance use, violence, and other problem behaviors

  18. Community Youth Development Study (CYDS) CYDS will test if CTC increases positive youth development in communities.

  19. CYDS Research Questions • Does CTC improve community planning and decision making? (Process Outcomes) • Does full installation of CTC affect targeted risk and protective factors and healthy or problem behaviors? (Behavior Outcomes)

  20. Theory of Change for Communities That Care. [Logic-model figure: CTC Implementation and Technical Assistance (system catalyst) → prevention system transformation constructs (adoption of science-based approaches, collaboration, community support, community norms, appropriate prevention program selection and implementation, Social Development Strategy) → decreased risk and enhanced protection → positive youth development (system outcomes).]

  21. CYDS Design: Community Selection and Matching • Forty-one free-standing incorporated towns with populations under 100,000 were chosen and matched in 1997. • The community pairs are similar in: • Population size • Demographic diversity • Crime statistics • Socioeconomic composition • Drug use

  22. CYDS Design (cont.) • The Diffusion Project (1997-2002) found that in 13 community pairs, neither community was using prevention science to guide its drug abuse prevention efforts. • 12 sets of these paired communities agreed to be randomly assigned to CTC or control condition in 2003.

  23. STUDY DESIGN: Randomized Controlled Trial, 2003–2008 (following a 5-year baseline, 1997–2002). [Timeline figure: after planning in 2003, intervention communities implement selected interventions from 2004 through 2008; intervention and control communities follow the same measurement schedule after randomization.] Legend: CTCYS = Cross-sectional student survey of 6th-, 8th-, 10th-, and 12th-grade students using the CTC Youth Survey; CKI = Community Key Informant Interview; CRD = Community Resource Documentation measuring effective prevention programs and policies in the community; CTC Board = CTC Board Member Interview; YDS = Longitudinal Youth Development Survey of students in the class of 2011, starting in 5th grade in spring 2004.

  24. Measurement Tools • CTC Team Member Interviews to measure the CTC process in each community • Community Key Informant Interviews to measure adoption of prevention science as a planning framework for preventive action in communities • Community Resource Documentation system to assess location and reach of programs consistent with proven prevention approaches • Student surveys to measure risk and protection and youth problem behaviors

  25. STUDY DESIGN (timeline figure and legend repeated from slide 23)

  26. Repeated Cross-Sectional Youth Surveys - CTC Survey • Target samples include all 6th, 8th, 10th, and 12th grade public school students in each community (Total N’s range from 160 - 2000 students per community) • Used to prioritize specific risk and protective factors for attention • Provide data on population trends in risk, protection, and outcomes in each community from 1998 to 2008

  27. STUDY DESIGN (timeline figure and legend repeated from slide 23)

  28. Longitudinal Youth Surveys - Youth Development Study (YDS) • Target samples include all 5th grade public school students in each community recruited in 2003 and 2004 • Provide data on individual level changes in risk, protection, and outcomes from 5th through 9th grades in each community • Provide data on individual students’ exposure to prevention activities in each community

  29. Panel: Youth Development Survey (YDS) • Annual survey of panel recruited from the Class of 2011 (5th grade in 2004) • Active, written parental consent

  30. STUDY DESIGN (timeline figure and legend repeated from slide 23)

  31. Youth Development Survey (YDS) • Participants recruited in grades 5 and 6. • Final consent rate = 76.4%

  32. 2007 YDS • 96.2% Overall Student Participation • 11.9% (n=525) have moved out of project schools

  33. The CONSORT (Consolidated Standards of Reporting Trials) Statement… was developed by a group of clinical trialists, biostatisticians, epidemiologists and biomedical editors as a means to improve the quality of reports of randomized controlled trials (RCTs).

  34. CONSORT Flow Diagram …shows the progress of participants throughout an RCT. Thoma et al., 2006

  35. CYDS Youth Development Study (panel sample) Grade 7 CONSORT flow diagram Hawkins et al., 2009

  36. Unit of analysis • What is the unit of analysis in your study? • Are there multiple units of analysis in your study? • Does the unit (or units) of analysis in your study correspond to your theory of change? • Does the unit of analysis in your study correspond to the unit of randomization? • Do you have enough units to do the appropriate statistical analysis? …to have sufficient statistical power?

  37. Three-level Pre-post ANCOVA Model
  Level 1 (Student i): G8Alc30ijk = π0jk + π1jk(G5Alc30ijk) + π2jk(Ageijk) + π3jk(Genderijk) + π4jk(Whiteijk) + eijk
  Level 2 (Community j): π0jk = β00k + β01k(Intervention Statusjk) + β02k(Populationjk) + β03k(PctFRLjk) + r0jk
  Level 3 (Matched Pair k): β00k = γ000 + u00k
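A hedged sketch of how a model of this form might be fit with statsmodels' mixed-effects formula interface, using random intercepts for matched pair with community as a variance component nested inside pair to approximate the three nested levels; the data file and column names are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level dataset: grade-8 outcome, grade-5 baseline,
# covariates, intervention status, and community / matched-pair identifiers.
df = pd.read_csv("cyds_pre_post.csv")

# Random intercept for pair; community modeled as a variance component
# within pair; students are the level-1 units.
model = smf.mixedlm(
    "g8_alc30 ~ g5_alc30 + age + gender + white"
    " + intervention + population + pct_frl",
    data=df,
    groups="pair",
    re_formula="1",
    vc_formula={"community": "0 + C(community)"},
)
print(model.fit().summary())
```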

  38. Four-level (Latent) Growth Model
  Level 1 (Time t): Alc30tijk = π0ijk + π1ijk(Timetijk) + etijk
  Level 2 (Student i): π0ijk = β00jk + β01jk(Ageijk) + β02jk(Sexijk) + β03jk(Whiteijk) + β04jk(Hispanicijk) + r0ijk; π1ijk = β10jk + β11jk(Ageijk) + β12jk(Sexijk) + β13jk(Whiteijk) + β14jk(Hispanicijk) + r1ijk
  Level 3 (Community j): β00jk = γ000k + γ001k(Intervention Statusjk) + γ002k(Populationjk) + γ003k(PctFRLjk) + u00jk; β10jk = γ100k + γ101k(Intervention Statusjk) + γ102k(Populationjk) + γ103k(PctFRLjk) + u10jk
  Level 4 (Community Matched Pair k): γ000k = ξ0000 + v000k
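For illustration only, a drastically simplified two-level version of this growth model (random intercept and time slope per student, collapsing the community and matched-pair levels) could be fit as follows; the time-by-intervention interaction carries the intervention effect on the growth rate, and the file and column names are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format dataset: one row per student per survey wave,
# with a numeric 'time' variable and student-level covariates.
long_df = pd.read_csv("cyds_growth_long.csv")

# Random intercept and time slope for each student.
growth = smf.mixedlm(
    "alc30 ~ time * intervention + age + sex + white + hispanic",
    data=long_df,
    groups="student_id",
    re_formula="1 + time",
)
print(growth.fit().summary())
```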

  39. Measurement issues • How did you select your measures? • Can the variables in your study be measured by a single item (question)? Or do you need multiple items (questions) to measure the phenomena? • Do the response options for the variables that you use in your study cover the full range of the phenomena? Should you dichotomize a continuous variable? • If you have multiple items, how did you put them together to measure the construct?

  40. Measurement issues (continued) • Are the measures normally distributed? • Are the variables in your study directly observable (manifest)? Or are they unobservable (latent)? • Do your variables measure the same construct across different groups or over different time periods?

  41. Measurement issues (continued) • Are your measures reliable? • Reliability = the consistency of your measurement, or the degree to which an instrument measures the same way each time it is used under the same condition with the same participants. • Test–Retest • Internal Consistency
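For the internal-consistency form of reliability, Cronbach's alpha is the usual statistic: alpha = k/(k−1) × (1 − Σ item variances / total-score variance). A minimal sketch; the item matrix below is made up for illustration, not study data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability of an items matrix
    (rows = respondents, columns = items on the same scale)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Made-up responses from 5 respondents on a 4-item scale.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```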

  42. Measurement issues (continued) • Are your measures valid? • Validity = the "best available approximation to the truth or falsity of a given inference, proposition or conclusion" (Cook & Campbell, 1979). • Internal Validity • External Validity • Construct Validity • Concurrent / Predictive Validity • “Face” Validity

  43. Example: Community Leader Support for Prevention • If you were deciding how to spend money for reducing substance abuse, what percentage would you allocate to prevention?

  44. Example: Stages of Adoption of a Science-Based Approach to Prevention Stage 0: No awareness. Stage 1: Awareness of prevention science terminology/concepts. Stage 2: Using a risk- and protection-focused prevention approach as a planning strategy. Stage 3: Incorporation of epidemiological data on risk and protection in the prevention system. Stage 4: Selection and use of tested and effective interventions to address prioritized risk and protective factors. Stage 5: Collection and feedback of program process and outcome data and adjustment of interventions based on data.

  45. Example: Prevention Collaboration 1=Strongly disagree, 2=Somewhat disagree, 3=Somewhat agree, 4=Strongly agree

  46. Example: Self-reported frequency of substance use in the Raising Healthy Children Project (R. F. Catalano, PI; see Brown et al., 2005 for details) • 0 = No use • 1 = once or twice • 2 = three to five times • 3 = six to nine times • 4 = 10 to 19 times • 5 = 20 to 39 times • 6 = 40 or more times

  47. Frequency distributions for alcohol use in past year (total sample). [Histograms for grades 6 through 10.]

  48. Frequency distributions for marijuana use in past year (total sample). [Histograms for grades 6 through 10.]

  49. Assessing intervention effects on developmental change • Are you assessing intervention effects during the appropriate developmental period? • How are you measuring “change?” Linearly? By a particular growth function? • Do you have enough measurement occasions (time points) to accurately measure change? • What are you measuring change in? Incidence? Prevalence?

  50. A Latent Variable Model. [Path diagram: latent factor f measured by indicators y1–y5 with measurement errors e1–e5; f regressed on Intervention Status.]
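A model like the diagram on slide 50 can be expressed in SEM software. A minimal sketch using the Python semopy package, assuming a wide-format dataset with hypothetical columns y1–y5 and intervention; this is an illustration of the model form, not the analysis used in the study:

```python
import pandas as pd
from semopy import Model

# Measurement model: latent factor f indicated by y1-y5 (residuals e1-e5
# are implied); structural model: f regressed on intervention status.
# Column names and the data file are hypothetical.
description = """
f =~ y1 + y2 + y3 + y4 + y5
f ~ intervention
"""

data = pd.read_csv("latent_indicators.csv")
model = Model(description)
model.fit(data)
print(model.inspect())  # parameter estimates and standard errors
```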
