
Designing & Implementing Randomized Controlled Trials For Community-Based Psychosocial Interventions

Phyllis Solomon, Ph.D.


School of Social Policy & Practice

University of Pennsylvania

March 17, 2010

Overview of Workshop

  • Introduction

  • So you think you want to do an RCT?

  • RCT Ethical Considerations

  • Planning an RCT

  • NIH Exploratory Research Grants

  • Developing Conceptual Foundation

  • Designing an RCT

  • Implementing an RCT

  • Generalizing RCT Outcomes


Introduction

What is an RCT?

  • True experimental design. Participants assigned by chance, following consent, to one of at least two conditions

  • Key features of classic experimental design:

    • Random assignment

      • determines who assigned to which group

    • Pre & post tests

      • outcome measured before & after intervention

    • Control group

      • same experiences as experimental group except no exposure to experimental stimulus

What is an RCT?

  • Can have more than two groups

  • Sometimes no pre-test measures

  • Chance does not necessarily mean equal, but known probability
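The last bullet — assignment by a known but not necessarily equal probability — can be sketched in a few lines. This is a minimal illustration, not the presenter's procedure; the 2:1 split and the group labels are hypothetical:

```python
import random

def assign(rng, p_experimental=2/3):
    """One participant's assignment: unequal (2:1) but with a known probability."""
    return "experimental" if rng.random() < p_experimental else "control"

rng = random.Random(42)
draws = [assign(rng) for _ in range(3000)]
share = draws.count("experimental") / len(draws)
# Over many draws the realized share approaches the known probability of 2/3.
```

Every participant still has a chance of each condition; only the odds differ from 50/50.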

Community-Based Psychosocial Interventions

  • Psychosocial Intervention – any service, program, educational curriculum, or workshop whose goal is to produce positive outcomes for individuals confronted with social &/or behavioral issues & challenges

  • Community-Based -Conducted in agency & social work settings

Community-Based Psychosocial Interventions

  • Community-based psychosocial intervention – reflects impact of environmental context in which interventions are embedded, on clients and providers & interactions between both/all systems

  • Less control, more complex environmental context with participants with multiple problems

RCT vs. Evaluation

  • Research uses scientific methodology to generate generalizable knowledge

  • Evaluation uses same methodology but primary goal is not for generalizable knowledge

  • For NIH grants do not use term evaluation

RCT vs. Evaluation

  • In evaluation RCT known as experimental study or a randomized field experiment

  • Both examine a program or policy

  • Both address effectiveness & cost-effectiveness

  • Evaluation experimental studies closely resemble community-based psychosocial RCTs

    • literature in this area may be helpful

RCT vs. Evaluation

  • Purpose of RCTs & field experiments may differ

  • RCT – research – generalizable knowledge

  • Field experiments – evaluation – answer local questions – but also policy questions of broader application

  • Semantic difference

Psychosocial Community-Based Interventions = Effectiveness Studies

  • Efficacy studies occur under ideal or optimum conditions

  • Effectiveness studies occur in “real world”

  • Efficacy studies greater internal validity

  • Effectiveness studies greater external validity

So You Think You Want To Do An RCT?

Appraising Whether to Move Forward with an RCT

Preliminary questions to be addressed before moving forward:

  • Is the question well justified?

  • Is the question an important one to answer?

  • Is the question addressing a gap in the literature?

  • Is the question an ethical one?

  • Is the question posing the correct question?

  • Would you fund this RCT?

Appraising Whether to Move Forward with Your RCT

Case Example 1

  • Is a 90 day Advanced Practice Nurse-Transitional Care Model more effective than usual discharge in improving adherence to treatment & quality of life for persons with SMI being released from a psychiatric hospital?

    • “hand-off” from hospital to home for persons with SMI linked to gaps in delivery of MH services

    • Consequently high rates of rehospitalization & poor outcomes

    • EBP – Advanced Practice Nurse-Transitional Care Model improves outcomes following acute medical care discharge for elderly adults with complex medical problems

Appraising Whether to Move Forward with RCT

Case Example 1 (continued)

- Intervention hybrid of case management, disease management, & home health care

- Nurse works with hospital team to develop discharge plan & then implement in the community

- Believe adapting this intervention has the potential to be equally successful with adults with SMI being discharged from an acute psychiatric hospital

Appraising Whether to Move Forward with RCT

Case Example 2

  • Is Multidimensional Treatment Foster Care (MTFC) Program more effective in reduction of disruptive behaviors than traditional Therapeutic Foster Care (TFC) among children in foster care?

    - Instability in foster care placement ranges from 22%-56%

    - Instability in placement due to child’s disruptive behaviors

    - TFC typically used for children with more demanding emotional & behavior needs & has more intensive structure & MH services

Appraising Whether to Move Forward with RCT

Case Example 2 (continued)

- Data on disruptions for TFC sparse but estimated 38%-70%

- Limited evidence on TFC effectiveness – most studies descriptive, methodologically flawed

- Lack of clear standards & specification of actual implementation of TFC

-MTFC – manualized intervention with goals to improve well-being & reduce disruptions

- MTFC placement augmented with coordinating an array of clinical interventions in family, school, & peer group

Appraising Whether to Move Forward with RCT

Case Example 3

Is CBT for adolescents with sickle cell disease (SCD) more effective than medical management of the disease in increasing coping strategies?

- adolescents with SCD have a number of adjustment difficulties that have received little attention

- some psychosocial difficulties include stress-processing

e.g. decreased coping strategies, lack of knowledge of SCD

- need to promote biological & psychosocial adjustment

Appraising Whether to Move Forward with RCT

Case Example 4

  • Is Forensic Assertive Community Treatment (FACT) more effective than forensic intensive case management (FICM) in a variety of psychosocial and clinical outcomes for homeless adults with SMI leaving jail?

    - Pop. has multiplicity of needs due to mental illness, homelessness, & criminal justice involvement

Appraising Whether to Move Forward with RCT

Case Example 4 (continued)

- cognitive deficits & poor social skills complicate ability to coordinate efforts to meet needs

- FICM single point of planning, monitoring & accountability considered beneficial for this pop.

- FICM specialized ICM

- FACT –team approach (shared caseload), self contained intervention to meet all needs of client – includes psychiatrist, case managers, etc.

- Based on ACT for criminally involved

RCT Ethical Considerations

  • Appropriate question to ask

  • Who ethically eligible to randomize

  • What ethical comparison

  • How & when to randomize

  • When are providers human subjects

  • What is ethical responsibility at termination

Justifying the RCT to Doubters

  • Want to provide most effective services to clients

  • Expectation when treated by a doctor

  • RCTs best means to making causal inference with high degree of confidence

  • Unethical to offer untested intervention

  • Not denying better treatment to controls

    • if answer known, there would be no need for study

  • Frequently those who receive services determined on a haphazard or a biased basis

Ethical Justification For Randomization

  • Lack of adequate evidence of effectiveness of exp. intervention under study

  • Experimental intervention theoretically justified to potentially benefit target pop.

  • Uncertainty of effectiveness (equipoise) – otherwise no scientific basis for RCT

Principle of Equipoise

  • Substantial degree of uncertainty / ambiguity necessary

    • Specific population

    • Setting

Integration of Practice & Research Ethics

  • Practice – interventions designed solely to enhance well-being of client & have a reasonable chance of success (Belmont Report, 1979)

  • Research – activities designed to test hypotheses, permit conclusions to be drawn, & thereby contribute to generalizable knowledge (Belmont Report, 1979)

  • RCTs = Practice & Research

Integration of Practice & Research Ethics

  • Practice ethics = human subject protections

    – may conflict w/ scientific rigor

  • Participant deterioration in experimental condition results in biased attrition

  • Exclusion criteria for clinical reasons – reduce external validity

Ethics of Scientifically Untested Interventions

  • Experimental intervention at least as effective as TAU

  • Do no harm - even if voluntarily consents

  • Risks assessment for participant

    • Extends to others & community-at-large

Ethics of Selecting Control Group

  • Justify no service comparison

    • Gas to no gas

  • Waitlist may be justified if agency normally has waiting list, or no service offered

  • Inert intervention may be justified

  • TAU may be most justifiable comparison

Consent Forms

  • Must inform potential participant that will receive experimental intervention by chance

    • i.e., like flipping a coin

  • Indicate chance of receiving experimental intervention

    • equal chance or 1 out of 3 chance

  • People grasp natural frequencies rather than probabilities

Consent Forms

    • Describe all interventions

    • Merely saying ‘standard care’ not helpful

    • Remember need to provide reasonable information to make a decision

    • Dishonest to promise benefit – uncertainty justification for study

    • Need to ensure non-participation will not jeopardize usual services to which entitled

When to Gain Consent

    • Gain consent prior to random assignment

    • Unethical to indicate allocation is by chance when already assigned

    • If assigned prior to consent, require two separate consent forms

    • Allocation prior to consent may result in biased attrition

Multiple Consent Forms

    • Screening for eligibility may require consent form

    • Children require assent & possible multiple consents

    • Process assessments may require consents from family members, providers etc

RCT Providers

    • Consents for providers – When are consent forms needed?

    • Need for Federal-Wide Assurance

Incentive Payments to Participants

    • Negotiate payments with agencies

      • Clients

      • Providers

    • Types of payments

Responsibilities at Termination of RCT

    • Provision for ongoing care of participants

    • Experimental service to control condition

    • Feedback & dissemination to agency

Data Safety & Monitoring

    • NIH requires a Data Safety & Monitoring Board for RCT oversight

    • Often 3-4 members – meet quarterly in person or via phone

    • Report adverse events – also to IRB

    • Review of adverse events

Considerations for Internet RCT

    • Consents handled either by mail or via Internet

    • Monitored or unmonitored interventions

    • Are internet communities public or private spaces?

    • Consent forms – need to specify potential risks due to internet

Determining Whether to Undertake an RCT

    • Selecting a site

      • Pipeline of available & willing eligible participants

      • Setting prepared & willing to commit & support RCT

        • Financially, space, & supervision

        • Others willing to financially support

        • Sustainability of effective intervention

Negotiating with the Setting

    • Top down & bottom up approach

    • Honesty in negotiating

      • “You’ll hardly know we are here”

      • Collaborative partnership

REAL SCORE

    • Respect for providers & clients

    • Establish credibility

    • Acknowledge strengths

    • Low burden

    • Shared ownership – reciprocity

    • Collaborative relationship

    • Offer incentives – be responsive & appreciative

    • Recognize environmental strengths

    • Ensure trust – be sure providers feel heard

Feasibility & Pilot Studies

    • Worthiness, practicality, feasibility & acceptability of intervention

    • Modification of intervention for new population

    • Pilot testing recruitment, retention, & data collection

    • Estimate required sample size

Defining Treatment / Program Manuals

    • Specifies:

      • Intervention

      • Standards for evaluating adherence

      • Guidance for training

      • Quality assurance & monitoring standards

      • Facilitation of replication

      • Stimulates dissemination & replication

        (Carroll & Rounsaville, 2008)

Treatment / Program Manuals

    • Brief literature review

    • Guidelines for establishing therapeutic relationship

    • Defining & specifying intervention

    • Contrast to other approaches

    • Specific techniques & content

    • Suggestions for sequencing activities

      (Carroll & Rounsaville, 2008)

Treatment / Program Manuals

    • Suggestions for dealing with specific problems

    • Implementation issues

    • Termination issues

    • Qualifications of providers

    • Training providers

    • Supervising of providers

      (Carroll & Rounsaville, 2008)

Treatment / Program Manuals

    Deal with structural aspects

    - Caseload

    - Staff qualifications

    - Location/setting

    e.g., space

    - Integration into service setting

    (Carroll & Rounsaville, 2008)

Criticisms of Treatment Manuals

    • Limited application to diversified population with complex problems

    • Overemphasis on specific techniques – rather than competencies

    • Focus on technique rather than theory

    • Reduction of provider competence

    • Lack of applicability to diverse providers

    • Designed for highly motivated & single problem clients

Adapting Existing Manuals

    • Use of qualitative methods

      • Focus groups

      • In-depth interviews

      • Group processes – nominal group process, Delphi method, & concept mapping

      • Ethnographic methods

Fidelity Assessment

    • Determining whether the intervention was conducted as planned and is consistent with service or program elements delineated in manual, including structures & goals

    • Fidelity measure

      – scale or tool assessing adequacy of implementation of service or program

      - means to quantify degree to which program service elements or services are implemented

Leakage Assessment

    • Assesses degree of contamination

    • Captures degree to which participants in control condition receive services planned only for experimental intervention

Developing & Piloting Fidelity Assessment

    • Self report measures

    • Chart reviews

    • Observations

    • Data extraction from billing forms

    • Service logs

    • Videotaping

    • Administrative data

Steps in Developing a Fidelity Measure

    • Define purpose of fidelity scale

    • Assess degree of model development

    • Identify model dimensions

    • Determine if appropriate fidelity scales already exist

    • Formulate fidelity scale plan

    • Develop items

    • Develop response scale points

Steps in Developing a Fidelity Measure

    • Choose data collection sources & methods

    • Determine item order

    • Develop data collection protocol

    • Train interviewers / raters

    • Pilot Scale

    • Assess psychometric properties

    • Determine scoring & weighting of items

      (Bond et al., Nov 2000)
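The "assess psychometric properties" step often starts with internal consistency. A minimal sketch of Cronbach's alpha in plain Python; the three fidelity items and five rated sites below are hypothetical, not from the slides:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list per scale item, each holding ratings for the same sites.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(item_scores)
    totals = [sum(site) for site in zip(*item_scores)]  # total score per site
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical fidelity ratings: 3 items rated at 5 program sites on a 1-5 scale.
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
alpha = cronbach_alpha(items)
```

An alpha in the high .80s, as here, would suggest the items hang together as one fidelity dimension; low values would send the scale back to the item-development step.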

R34 Research Mechanism

    • Purpose

      – to evaluate feasibility, tolerability, acceptability & safety of novel approaches to improving mental health & modifying health risk behavior

      - to obtain preliminary data needed as prerequisite to efficacy or effectiveness intervention or service study

      Key purpose - data for larger scale (R01) study

R34 Research Objectives Relevant to RCTs

    • Development & pilot testing new or adapted intervention

      • Examples

        • Develop, adapt, or revise intervention for different target population

        • Testing & refining intervention manual

        • Development or adaptation of measures,

          • e.g., provider competency, adherence to protocol, implementation fidelity measures

        • Pilot test of efficacy trial

R34 Research Objectives Relevant to RCTs

    • Adaptation & pilot testing for effectiveness

      • process of moving from efficacy research to effectiveness research

        • Feasibility studies to assess parameters for conducting efficacy intervention in “real world service environment”

        • Standardization of research instruments

        • Studies to develop & standardize training protocols, supervisory standards, or implementation of fidelity procedures

Example of Process of Adapting Effective Intervention

    • Use of qualitative interviews with participants & social supports to assess needs & role of mental illness for specific cultural group

    • Use Advisory Board

      • Logic Model Process

      • Identify & prioritize determinants based on qualitative data

    • Review of past & current existing programs

    • Develop intervention plan & theory

    • Focus group assessment of intervention plan

    • Develop process & outcome plan

Example of Developing Intervention for RCT

    • Adding criminogenic component to multifaceted biopsychosocial treatment model for mentally ill offenders in prison

    • Criminogenic component based on CBT –cognitive restructuring

    • Need to assess criminal thoughts & attitudes of mentally ill offenders

Example of Developing Intervention for RCT

    • Use 2 existing measures to assess these factors that have been used with non-mentally ill offenders

    • Determine if factor structure for these measures same as for non-mentally ill

    • Cluster analyses of these two measures and DSM disorders for implications for structuring criminogenic component

Example of Refining Existing Intervention

    • New conceptual model measuring service context variables & moderator variables to determine effects on outcomes

    • Quantitatively assessing conceptual model

      • Test utility of model

      • Estimate effect sizes of predictor variables & outcomes

    • Qualitative component – examine experiences of implementing intervention & identify factors that promote or inhibit effectiveness of intervention

    • Refine model based on results & more definitively operationalize service context variable & implementation of intervention

Conceptual Foundations for RCTs

    • Theories for RCTs support explanatory models of process & outcomes

    • Frameworks that delineate role of intervention in affecting change

    • Empirical base justifies change over time – expected timeframe for specific levels of change

Common Theories for Interventions

    • Cognitive Behavioral Theory

    • Social learning theory

    • Stress, Coping, & Adaptation

    • Social Support

    • Social Capital

    • Health Beliefs

    • Theory of Planned Behavior/Theory of Reasoned Action

    • Transtheoretical Model of Change

Stronger Theoretical Models

    • Mediators

      • variable that is hypothesized to help make change happen

      • Conceptual link in the middle of cause & effect argument

      • Sometimes referred to as intervening or process variable

      • Mechanisms of change in outcomes associated with the intervention & precede outcome

Mediation Diagram




Mediation

    • Step 1 Show intervention variable is correlated with outcome

    • Step 2 Show intervention variable is correlated with mediator

    • Step 3 Show mediator affects outcome

    • Step 4 To establish mediation, effect of intervention on outcome, controlling for mediator, should equal 0 (full mediation) or be greatly reduced (partial mediation)
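The four steps above (in the Baron & Kenny style) can be sketched on simulated data. This is an illustration only; the coefficients (0.8 for intervention → mediator, 0.5 for mediator → outcome) and the sample are hypothetical, and here the intervention works entirely through the mediator:

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def slope(x, y):
    """Bivariate regression slope of y on x."""
    return cov(x, y) / cov(x, x)

def partial_slopes(y, x, m):
    """Coefficients of x and m in y ~ x + m, via the normal equations."""
    sxx, smm, sxm = cov(x, x), cov(m, m), cov(x, m)
    sxy, smy = cov(x, y), cov(m, y)
    det = sxx * smm - sxm ** 2
    return (sxy * smm - smy * sxm) / det, (smy * sxx - sxy * sxm) / det

rng = random.Random(7)
n = 2000
x = [float(rng.random() < 0.5) for _ in range(n)]   # 0/1 intervention assignment
m = [0.8 * xi + rng.gauss(0, 1) for xi in x]        # mediator moved by intervention
y = [0.5 * mi + rng.gauss(0, 1) for mi in m]        # outcome moved only through mediator

step1 = slope(x, y)                  # Step 1: intervention -> outcome (total effect)
step2 = slope(x, m)                  # Step 2: intervention -> mediator
b_x, b_m = partial_slopes(y, x, m)   # Steps 3 & 4: mediator effect; direct effect near 0
```

With full mediation, `b_x` (the direct effect after controlling for the mediator) hovers near zero while `step1`, `step2`, and `b_m` are clearly nonzero.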

Stronger Theoretical Models

    • Moderators

      • Variable that interacts with intervention in such a way that interaction variable has a different effect or strength of the effect on the outcome

      • Moderators alter strength of causal relationship

        • e.g., psychotherapy may reduce depression more for men than women or high risk youths do better on outcomes

Stronger Theoretical Models

    • Moderators associated with service context &/or service population

      • e.g., police intervention program for persons with mental illness (Crisis Intervention Team) moderated by available MH treatment programs in community

    • Moderator analysis assesses external validity – answers question of how universal the causal effect is
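The earlier example — psychotherapy reducing depression more for men than for women — amounts to comparing the treatment effect within each subgroup; the gap between subgroup effects is the moderation. A simulated sketch (all numbers hypothetical):

```python
import random

def subgroup_effect(rows, group):
    """Mean outcome difference (treated - control) within one subgroup."""
    treated = [y for g, t, y in rows if g == group and t == 1]
    control = [y for g, t, y in rows if g == group and t == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

rng = random.Random(3)
rows = []
for _ in range(4000):
    g = rng.choice(["men", "women"])
    t = rng.randint(0, 1)                 # randomized 1:1 to treatment or control
    effect = -6 if g == "men" else -2     # hypothetical: larger symptom drop for men
    y = 20 + effect * t + rng.gauss(0, 4) # depression score, lower is better
    rows.append((g, t, y))

eff_men = subgroup_effect(rows, "men")
eff_women = subgroup_effect(rows, "women")
interaction = eff_men - eff_women         # nonzero gap = moderation by gender
```

In a regression framing the same gap appears as the coefficient on the treatment-by-gender interaction term.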

Moderator Diagram




Experimental Intervention Compared to What?

    • Essence of RCT question is “Compared to What?”

    • Need to consider what usual care is – TAU

    • If no usual care, nothing or waitlist appropriate comparison

    • Benign intervention, such as supportive or educational interventions, not expected to have deep or lasting impact on outcome measure

    • Control condition used to control for attention or placebo effect as could affect outcome

Examples of Comparisons

    • Consumer Case Management Teams compared to Non-Consumer Case Management Teams

      • Outcomes essentially same for both teams

      • Limitation – could be that both teams were equally ineffective; with no control condition this alternative hypothesis could not be ruled out

Examples of Comparisons

    • Problem–Solving Educational intervention compared to depression education materials & referral for antidepressant medication among elderly with depressive symptoms receiving home health care for their medical problems

      • Standard care alone not felt to be strong comparison to determine effectiveness

      • Limits external validity of study results

Be Sure Design Matches the Policy Relevant Question

    Is Mental Health Treatment Court (MHTC) more effective than usual adversarial court processing in reducing criminal activity and improving psychosocial functioning for adults with mental illness involved in the criminal justice system?

    - MHTC part of large movement of “therapeutic jurisprudence”

    - designed to reduce arrests & jail time by addressing psychosocial needs of indiv.

    - MHTC involves cooperative agreements between criminal justice & MH treatment providers

Be Sure Design Matches the Policy Relevant Question

    - Indivs. served have poor tx compliance leading to erratic behaviors, but can safely be diverted from criminal justice system

    - ACT is EBP for helping persons with SMI

    - MHTC incorporated an ACT approach

    - Adversarial court processing received usual MH services

What Design Captures the Relevant Policy Question for Case Example?

    • Design employed:

      R: MH Court +ACT

      R: TAU Court + TAU MH services

Problems with Design Employed

    • Study provided most positive evidence of MH Courts

    • However, was it MH Court or ACT?

    • Or, interaction of the two?

    • Do not know

Design Required to Answer Policy Relevant Question

    R: MHC + TAU MH services

    R: MHC + ACT

    R: TAU court + ACT

    R: TAU court + TAU MH services (Control Condition)
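The four arms above form a 2 × 2 factorial: randomizing the court factor and the service factor independently produces all four conditions with equal probability. A minimal sketch (arm labels taken from the slide; the allocation scheme itself is an assumption):

```python
import random
from itertools import product

# The 2 x 2 factorial: court condition crossed with MH service condition.
ARMS = list(product(["MHC", "TAU court"], ["ACT", "TAU MH services"]))

def assign_arm(rng):
    """Randomize each factor independently; all four arms are equally likely."""
    court = rng.choice(["MHC", "TAU court"])
    services = rng.choice(["ACT", "TAU MH services"])
    return (court, services)

rng = random.Random(1)
counts = {arm: 0 for arm in ARMS}
for _ in range(4000):
    counts[assign_arm(rng)] += 1
```

The factorial structure is what lets the analysis attribute outcomes to the court, to ACT, or to their interaction — at the cost, as the next slide notes, of a sample large enough to fill all four cells.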

Design Required to Answer Policy Relevant Question

    • Policy relevant design provides attribution to outcome of

      • Court

      • ACT

      • Interaction of the two

  • However require sufficient sample size of eligible & willing participants

Controlling for Contamination

    • Referred to as blurring of conditions, drift, or treatment dilution

    • Ways contamination may occur:

      • Control condition participants gain benefit from experimental condition

      • Experimental condition drifts toward control condition

      • Control condition drifts toward experimental

Examples of Drift

    • Caused by either provider or client behavior

    • Drift between ACT & individual intensive case managers

      • Individual case managers from same agency began functioning as a team

      • Resulted in blurring of conditions

    • Clients sharing same waiting room

    • Behavioral anger management intervention with homework assignments taking place in a residential treatment setting

Potential Solutions

    • Different locations

    • Different times of operation

    • Different providers delivering the exp. & control interventions

      But these solutions raise additional confounds

      - result in different types of clients

      - providers with different qualifications & experience

Design Consideration to Address Contamination

    • Provider qualifications

    • Training providers

    • Ongoing support to providers

    • Monitoring of interventions

Changes in Intervention Environment

    • History – internal validity threat

    • Policy change may affect one or both conditions

    • Becomes a confound when it interacts with one condition differentially to affect outcome

    • One proposed strategy is nested RCT in a longitudinal quasi experimental design

    • Another is conducting continuing ongoing process assessment

Biased Attrition

    • Biased attrition to one condition or the other is real threat to internal validity

    • E.g., Concern of biased attrition in control condition of ACT homeless jail study

    • Loss also reduces power

Potential Design Solutions to Attrition

    • Protocol designed to engage & keep participants engaged

    • Pre-randomization introductory phase absorbing early stage attrition

      • Trade off – reduced external validity

    • Increased incentive payment at points expect greater loss

      • e.g., Exit from prison

    • More participants assigned to condition with greater anticipated loss

    • Statistical procedures – require anticipation to obtain necessary data

Randomization

    • Usually equal assignment to all conditions

    • Unequal assignment requires justification

    • Computer randomization preferred to physical manipulation

When to Use Stratified Randomization?

    • Randomization may not ensure equal proportions across conditions

    • When sample size small

      • e.g., less than 100

    • Subpopulation small

      • e.g., less than 20%

    • Bigger problem of small samples – low power

    • Increases complexity

When to Use Cluster Randomization?

    • Control for contamination

      • e.g., same providers delivering two interventions

    • Efficiency – everyone in intervention served in one location

    • Cost & time-efficient – can’t feasibly gain consent from everyone when a policy or guideline changes

    • Limitation – requires larger sample size to maintain power
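The sample-size penalty comes from the design effect, 1 + (m − 1) × ICC, where m is the cluster size and ICC the intraclass correlation among members of a cluster. A small sketch; the cluster size of 20 and ICC of 0.05 are hypothetical planning values, not figures from the slides:

```python
def design_effect(cluster_size, icc):
    """Variance inflation from randomizing clusters instead of individuals."""
    return 1 + (cluster_size - 1) * icc

def inflated_n(individual_n, cluster_size, icc):
    """Sample size needed under cluster randomization to keep the planned power."""
    return individual_n * design_effect(cluster_size, icc)

# Hypothetical: 128 participants suffice under individual randomization;
# with clusters of 20 and ICC = 0.05 the requirement nearly doubles.
n_cluster = inflated_n(128, 20, 0.05)
```

Even a modest ICC inflates the requirement substantially, which is why cluster trials need more participants to maintain power.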

    When to Use Blocked Randomization?

    • When employing group interventions

    • Control flow into different conditions

    • Assignments made for smaller units, such as in blocks of 4, 6, etc.
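A permuted-block sketch of the idea above: within each block of 4, exactly two participants go to each arm, so the flow into conditions stays even as enrollment proceeds. Block size and seed are illustrative:

```python
# Permuted-block randomization sketch: shuffle a balanced block of
# assignments, emit it, repeat until every participant is covered.
import random

def blocked_assignments(n_participants, block_size=4, seed=7):
    rng = random.Random(seed)
    block = ["experimental", "control"] * (block_size // 2)
    sequence = []
    while len(sequence) < n_participants:
        rng.shuffle(block)          # new random order for each block
        sequence.extend(block)
    return sequence[:n_participants]

seq = blocked_assignments(12)  # every consecutive block of 4 is 2/2
```

In practice block sizes are often varied or concealed so providers cannot predict the next assignment from the pattern.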

    Blinding

    • Controls for potential bias, specifically reactivity of client &/or provider

    • Difficult to do in community-based psychosocial interventions

    • Possibly blind data collectors

    Randomization in Practice

    • Assignment occurs after consent & baseline assessment are completed

    • Random assignment not in hands of providers or even research workers

    • Procedures for random assignment centrally controlled to protect against subversion

    How to Design Recruitment & Sampling Strategy

    • Need to demonstrate can consent & maintain sufficient sample size for analysis

    • Need to determine at what point in pipeline feasible & conceptually justified to recruit

    What Inclusion Criteria to Consider?

    • Need to operationally define inclusion & exclusion criteria

    • Consideration & implications of criteria, e.g., new intakes vs. current clients (i.e., current clients test TAU + experimental intervention)

    • Consideration of age, diagnoses, language, & geography

    What Exclusion Criteria to Consider?

    • Vulnerable populations

    • Co-morbid or specific disorders

    • Specific system status levels

    • Frequently, no exclusion criteria

    Considering Sample Method & Recruitment Process

    • Frequently use consecutive samples

    • Combination of purposive, snowball, & quota samples

    • Agency staff vs. research staff doing recruitment

    Determining Sample Size

    • Determining effect size

      • Prior research – literature

      • Pilot studies

    • Estimating attrition

      • Prior studies

      • Pilot studies
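Once an effect size has been estimated from the literature or a pilot, the per-arm sample size for a two-arm comparison of means can be sketched with the standard normal-approximation formula, n = 2((z₁₋α/₂ + z₁₋β)/d)². The effect size below is illustrative:

```python
# A-priori per-arm sample size for a two-arm mean comparison,
# two-sided test, normal approximation (no t-correction).
import math
from statistics import NormalDist

def n_per_arm(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """d: standardized effect size (Cohen's d) from prior work or a pilot."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2)

print(n_per_arm(d=0.5))  # medium effect -> 63 per arm
print(n_per_arm(d=0.8))  # large effect  -> 25 per arm
```

This target is then inflated for the attrition estimated from prior studies or pilots, as the slide's second point requires.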

    Operationalizing Experimental & Control Interventions

    • Need to clearly specify all conditions

      • Experimental interventions – manualized/tool kit – clearly specified intervention

      • Justify that exp. & control conditions truly differ

      • Need to operationalize TAU & benign interventions

    • If exp. longer or more intense (dosage) than control – time &/or amount may be variables affecting outcome

    Outcome Measures & Data Points

    • Need psychometrically sound measures – unreliable measures reduce power

    • Valid measures for sample

    • Sensitive to capture change in short time frame

    • Some concepts unlikely to change in short time frame

    • Justify time period for data points

    Approaches to Data Analysis

    • Expected to use Intent-to-treat analysis

    • Avoid temptation of eliminating participants receiving limited service

    • Dosage effect variables – fidelity, compliance, adherence, & engagement

    • Carefully conceptualize dosage effect so it does not substitute for the main independent variable

    • With enough data points can estimate missing data

    • Involve a statistical consultant early in the design process
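The intent-to-treat principle above can be made concrete: every randomized participant is analyzed in the arm they were assigned to, regardless of the dose they actually received. The data below are fabricated for illustration:

```python
# Intent-to-treat sketch: dose received is deliberately ignored when
# forming analysis groups; assignment alone determines the arm.
from statistics import mean

participants = [
    # (assigned_arm, sessions_attended, outcome_score)
    ("experimental", 12, 18),
    ("experimental", 2, 11),   # limited service -- stays in analysis
    ("experimental", 10, 16),
    ("control", 0, 9),
    ("control", 11, 12),
    ("control", 8, 10),
]

def itt_means(rows):
    by_arm = {}
    for arm, _dose, score in rows:  # _dose ignored on purpose (ITT)
        by_arm.setdefault(arm, []).append(score)
    return {arm: mean(scores) for arm, scores in by_arm.items()}

print(itt_means(participants))
```

Dropping the low-attendance participant, as the slide warns against, would turn this into a per-protocol analysis and reintroduce selection bias.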

    Concerns with RCTs

    • Highly selective samples

    • Preferences interacting with actual service delivered

    • Complex interventions not accounting for accumulative effects of all service components

    Alternative Designs to RCTs

    • Fixed adaptive designs

      • randomly assigned to condition, but progress through intervention determined by intensity of treatment need

    • Randomized adaptive designs

      • changes in service condition are done by randomization to choices of participant or provider

    Alternative Designs to RCTs

    • Encouragement or randomized consent trials

      • Participants are encouraged to take up one service option or the other, but are not restricted to the selected option

    • Randomized preferences

      • Participants decide whether they will be randomized or choose their service option

    Preparing Setting for RCT

    • Inform the setting with time for preparation, but not so far in advance that it's forgotten

    • Research & Agency jointly decide on how & when to inform personnel

    • Jointly present RCT with administrators & staff

    • Need to sell RCT on benefits to setting, providers, & clients

    • Don’t oversell what can’t be delivered

    • Understand provider’s perspective

    Preparing Setting for RCT

    • Sensitivity to the language & examples employed

    • Turn lack of clarity into an advantage

    • Anticipate questions & issues & raise them first

    • Address random assignment in straightforward manner

    • Try to counter negative momentum

    • Positive frame of mind critical – “you need them more than they need you”

    Tracking Participants

    • At enrollment, participants complete a locator form – a working document of all info that helps find someone, including:

      • Demographic & identifying info

      • Relatives, info from multiple people at different locations

      • Professional contacts for contact info

      • Incidental contacts

        • e.g., where one goes when out of money, or hungry, or where one sleeps when homeless

    Tracking Participants

    • Working document

      – update every time the participant is contacted

      – indicate helpful & unhelpful info

      – offer an incentive for participants to contact researchers with changes of info

    • Computerized system to generate timely lists for follow-up data points
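A computerized system for generating timely follow-up lists can be as simple as comparing enrollment dates against the follow-up schedule. The field names and 6-month window below are illustrative:

```python
# Follow-up list sketch: flag participants whose data point is due.
from datetime import date, timedelta

enrolled = {"P001": date(2010, 1, 4), "P002": date(2010, 3, 1)}
FOLLOW_UP = timedelta(days=182)  # ~6-month data point (illustrative)

def due_for_follow_up(roster, today):
    """Sorted IDs of participants whose follow-up window has opened."""
    return sorted(pid for pid, d in roster.items() if d + FOLLOW_UP <= today)

print(due_for_follow_up(enrolled, today=date(2010, 7, 15)))  # -> ['P001']
```

Running such a query on a schedule gives data collectors lead time to start tracking before a window closes.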

    Monitor Recruitment

    • Use the tracking system to monitor recruitment & ensure the sample accrues on schedule to meet projections

    • A big push late in the study results in non-completers of the intervention &/or outcome measures

    • Relying totally on providers is usually ineffective

    • Creative means to control recruitment within confidentiality & legal policies

    • Remember providers usually do not want responsibility for recruitment

    Referral Process

    • Providers make referrals, but best they not do eligibility determinations or consents

    • Providers obtain Release of Information form from potential eligible participants

    • Provider referrals based on easily observable or obtainable criteria (using system categorization), & casting wide net

      • Lessens burden on providers

      • Providers more likely to do

    Ensuring Participant Retention in Research

    • Collect complete locator info at study entrance

    • Inform participants when they will be followed up

    • Review locator information at subsequent data collection points

    • Offer adequate incentives

    • Employ effective research data collectors

    Ensuring Participant Retention in Research

    • Document all follow-up activities in detail

    • Exploit contact information obtained

    • Reasonably accommodate participant for follow up data collection

    • Allocate enough resources for travel

    • Allow ample time for tracking down participants

    Ensuring Participant Engagement in Intervention

    • Communicating importance of intervention

    • Outlining benefits & expectations of participants

    • Making minor modifications

      • e.g., reducing # of sessions if too many dropping out

    • Training providers (both conditions) in engagement, retention, & relationship building

    • Building trust

    • Incorporating outreach efforts as part of intervention

    • Novel thinking

      • e.g., giving up professional offices

    Qualifications & Training of Providers

    • Equality of qualifications for all conditions – otherwise confounding

    • Training of Experimental Providers

      • Human subject protections

      • Overview & purpose of RCT

      • Conceptual basis of RCT

      • Design of RCT

      • Appealing argument for need for random assignment

      • Operation & implementation of RCT

    Training Experimental Providers

    • Introduction to intervention

    • Program philosophy

    • Program goals & principles

    • Practice experience delivering intervention

    • Role modeling with target population

    • Review manual/toolkit, etc.

    • Using existing training material if available

    • Consider hiring trainer – control for potential bias

    • Consider on-going support, coaching, booster sessions

    Training Experimental Providers

    • Training involves engaging, teaching, & supporting in performance of intervention

    • Training ensures fidelity of intervention

    • Provision for new hires

    • Supervisors need to be trained

    • Supervision/monitoring of exper. providers best done by those with investment in RCT – e.g., research staff

    Training & Monitoring Control Condition

    • Less involved than experimental condition

    • Training in eligibility determination

    • Training in completion of fidelity / leakage forms

    • Researcher monitor fidelity / leakage forms to take corrective action

    Training & Supervising Research Staff

    • Rationale for RCT

    • Overview of RCT

    • Human Subject Protection

    • Recruitment procedures

    • Randomization process

    • Review of all data collection forms

    • Training in experimental intervention if providing ongoing support & technical assistance

    Fidelity Assessment

    • Time points – developmental & mature phases of intervention

    • Provider & client perspective

    • Data sources

      • Billing data

      • Treatment/activity logs

      • Attendance records

      • Site visits

      • Ethnographic methods

        • e.g., shadowing

    Assessing Environmental Context

    • Systematically tracking organizational changes

      • e.g., policy, eligibility requirements by dates

    • Use of quantitative & qualitative methods

    • Importance as participants will be served over time – not all served at same point in time

    Implementation Disaster

    • To test effectiveness of self-help for persons with severe mental illness

    • Roster of 1,185 clients from an urban CMHC who received treatment in the past 2 years; 853 met eligibility; the pool decreased to 241 due to hospitalization, prior self-help participation, etc.; only 90 consented, completed data, & were randomly assigned

    • Inclusion criteria:

      • Dx. schiz., schizaffective, or major mood

      • Normal intelligence

      • Not participated in self help

    Implementation Disaster

    • Both groups monitored for self help attendance to assess contamination

      • reviewed daily sign in sheets of self help group

    • 17% of both conditions participated in self help

    • Self help has selective rather than universal appeal

    • Outreach efforts minimally affected participation

    • Self-selection has a tremendous impact on sample size in self-help research – not likely to recruit an adequate randomized sample with no prior exposure to self help from a single geographical area

      (Kaufman, Schulberg, & Schooler, 1994)

    Final Note

    • Generalizability to other service settings

    • Sustainability of intervention in research setting

    • Transparency of reporting RCTs

      • CONSORT: checklist & flow diagram

    • If the experimental intervention is effective, cost-effectiveness is important – but it needs to be designed in at the beginning of the study, not as an afterthought