DEPARTMENT OF DEFENSE


Presentation Transcript


  1. DEPARTMENT OF DEFENSE Challenges For FAP in “The Way Ahead” August 10, 2009

  2. Challenges
  • Demonstrating effectiveness
  • Implementing promising, good & best practices
  • Improving accuracy of data collection
  • Creating joint bases
  • Improving DoD's response to domestic abuse and child abuse
  • Revising the Case Review Committee process
  • Addressing the Reserve Component
  • Addressing Wounded, Ill, and Injured service members' treatment for family violence

  3. Demonstrating Effectiveness
  • It's a "best practice"
    • It's superior to alternative approach(es) because it
      • Produces better results, or
      • Produces equal results more efficiently/cheaply/in accord with values & culture
  • It's not superior, but it's a "good" practice
  • So far, it's only a "promising" practice

  4. Demonstrating Effectiveness
  • It's a promising/good/best practice because it:
    • Is designed on a logic model
    • Replicates/builds on evidence-supported practice
    • Has positive results from program evaluation
    • Has ongoing systematic data collection and analysis

  5. Implementing Best Practice (2)
  • Build a logic model (see the sketch after this list)
    • Define desired immediate, intermediate, and long-term outcomes
    • Identify and assess strategies/activities that may produce them
      • Are they available? Are they appropriate?
    • Ascertain how/why they would produce them
      • Theoretical foundation
      • Results of prior research/evaluation
    • Identify and assess what data will measure success/failure
      • What data collection methods are available?
      • What analysis needs to be performed?
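A logic model of this kind can be captured in a simple data structure so outcomes, strategies, rationales, and measures stay linked. The sketch below is a hypothetical illustration in Python; the class names, fields, and example program are assumptions for discussion, not an OSD specification.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    """A desired result and the data that will show whether it was achieved."""
    name: str                    # e.g., "Reduced repeat incidents" (illustrative)
    horizon: str                 # "immediate", "intermediate", or "long-term"
    measures: list[str] = field(default_factory=list)  # data measuring success/failure

@dataclass
class Strategy:
    """An activity expected to produce one or more outcomes, with its rationale."""
    name: str
    rationale: str               # theoretical foundation or prior research/evaluation
    outcomes: list[Outcome] = field(default_factory=list)

# Hypothetical example of filling in the model
repeat_incidents = Outcome(
    name="Reduced repeat incidents",
    horizon="intermediate",
    measures=["Central Registry re-entries within 12 months"],
)
home_visits = Strategy(
    name="New Parent Support home visiting",
    rationale="Builds on evidence-supported home-visiting models",
    outcomes=[repeat_incidents],
)
```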

  6. Implementing Best Practice (3)
  • Incorporate Evidence-Supported Practice (ESP)
  • ESP combines:
    • Best research evidence with
    • Best clinical experience that is
    • Consistent with family/client values

  7. ESP Categories
  • In declining order:
    • Well-supported by research evidence
    • Supported by research evidence
    • Promising research evidence
    • Failure to demonstrate effect
    • Concerning practice
    • Not able to be rated
  California Evidence-Based Clearinghouse for Child Welfare: http://www.cachildwelfareclearinghouse.org

  8. ESP Factors
  • No empirical/clinical evidence or theoretical basis indicating substantial risk of harm, compared to likely benefits
  • Book, manual, or other writings describing the protocol
  • Form of control to show benefit of the practice over placebo
    • Randomized controlled trial (RCT)
    • Untreated group/placebo group/matched wait-list group
  • Reliable and valid outcome measures applied consistently and accurately
  • Duration of sustained effect
  • Publication in peer-reviewed professional literature
  • Replication
  • If there are multiple outcome studies, the overall weight supports benefit of the practice
  http://www.cachildwelfareclearinghouse.org

  9. Implementing Best Practice (4)
  • Plan evaluation before starting
    • Document current protocol & outcomes
    • Select the new model's data collection strategy and instruments
    • Pilot test the new protocol
  • Ensure fidelity to the model
    • Prepare and disseminate a protocol manual
    • Train staff in the protocol
    • Monitor fidelity of experimental and control groups to their respective protocols

  10. Implementing Best Practice (5)
  • Collect data
  • Conduct edit checks (sketched below)
  • Analyze and draw conclusions
  • Identify applicability and limitations of the study
  • Prepare findings for publication
  • Cooperate with replications
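As a rough illustration of what automated edit checks might look like, the Python sketch below flags missing fields and unknown codes in incident records before they feed rate calculations. The field names and allowed values are hypothetical and are not the FAP Central Registry schema.

```python
# Hypothetical edit checks on incident records prior to rate computation.
VALID_STATUS = {"substantiated", "unsubstantiated"}
VALID_TYPE = {"child abuse/neglect", "domestic abuse"}

def edit_check(record: dict) -> list[str]:
    """Return a list of problems found in one incident record."""
    problems = []
    for name in ("installation", "incident_type", "status", "fiscal_year"):
        if not record.get(name):
            problems.append(f"missing {name}")
    if record.get("status") not in VALID_STATUS:
        problems.append(f"unknown status: {record.get('status')!r}")
    if record.get("incident_type") not in VALID_TYPE:
        problems.append(f"unknown incident type: {record.get('incident_type')!r}")
    return problems

records = [
    {"installation": "Base A", "incident_type": "domestic abuse",
     "status": "substantiated", "fiscal_year": 2009},
    {"installation": "", "incident_type": "domestic abuse",
     "status": "substantiatd", "fiscal_year": 2009},  # typo and missing installation
]
for i, rec in enumerate(records):
    for problem in edit_check(rec):
        print(f"record {i}: {problem}")
```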

  11. Importance of Edit Checks
  • Errors can have consequences
  • 2 errors on substantiated incidents per installation change:
    • Army (USA) rates:
      • CAN rate by +/- 0.54
      • DA rate by +/- 0.83
    • DoD rates:
      • CAN rate by +/- 0.1
      • DA rate by +/- 0.2
  • Errors can lead to erroneous interpretations (see the sketch below)
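To see why a couple of miscoded incidents per installation can move a Service-wide rate, consider the back-of-the-envelope sketch below. It assumes the rates are expressed per 1,000 (a common FAP convention); the installation count and population are made-up placeholders, not DoD figures.

```python
def rate_shift(errors_per_installation: int, installations: int, population: int) -> float:
    """Change in a per-1,000 rate if every installation carries the same error."""
    return errors_per_installation * installations / population * 1000

# Hypothetical example: 2 errors at each of 100 installations against a
# population of 400,000 at-risk persons shifts the rate by 0.5 per 1,000.
print(rate_shift(errors_per_installation=2, installations=100, population=400_000))
```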

  12. Force Well-Being Scales
  • "Are the wheels coming off?"
  • Semi-annual lagging indicators
  • Risk behaviors
    • Spouse abuse by AD personnel
    • Child abuse/neglect
      • By AD personnel
      • By AD parent
      • By civilian parent
      • By combined parents
  • Compare current half year to the half year in FY 2000 (see the sketch below)
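The slide does not spell out how the comparison to FY 2000 is scored; the Python sketch below assumes the indicator is reported as the current half-year rate indexed to the FY 2000 half-year baseline. The rates used are invented for illustration only.

```python
def well_being_index(current_rate: float, fy2000_rate: float) -> float:
    """Current half-year rate expressed as a percentage of the FY 2000 baseline."""
    return current_rate / fy2000_rate * 100

# Hypothetical: a current spouse-abuse rate of 11.2 per 1,000 against an FY 2000
# baseline of 14.0 per 1,000 indexes at 80 (i.e., 20% below the baseline).
print(well_being_index(current_rate=11.2, fy2000_rate=14.0))
```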

  13. Force Well-Being Scales (chart: Effects of Data Errors on Army and DoD rates)

  14. Implementing Joint Basing
  • Hickam AFB
  • Andersen AFB
  • Bolling AFB
  • McChord AFB
  • NWS Charleston
  • Fort Dix & NAES Lakehurst
  • NAF Washington
  • Fort Richardson
  • Randolph AFB & Fort Sam Houston
  • Fort Eustis
  • NS Pearl Harbor
  • NB Guam
  • NSA-W Anacostia Annex
  • Fort Lewis
  • Charleston AFB
  • McGuire AFB
  • Andrews AFB
  • Elmendorf AFB
  • Lackland AFB
  • Langley AFB

  15. Implementing Joint Basing (2)
  • The supported installation's program integrates into the supporting installation's program
  • FAP construed as a "base support function," NOT a "mission support function"
  • FAP standardized services approximate COLS (Common Output Level Standards)

  16. MOU Personnel Issues to Implement Joint Basing
  • Personnel billets/positions
    • Military FAOs & SWs
    • Civil Service & NAF
    • Contractors
      • Installation or centralized contract
    • Seniority/priority placement
  • Credentialing process
  • Location
  • Supervision
  • Funding
    • FY 2010: PBAS, MIPRs, and other temporary "fixes"
    • FY 2011: PBAS

  17. Changing the Case Review Committee Process
  • Purposes
    • To reduce variability in decision-making
      • Improve quality of data in the Central Registry
      • Improve fairness
      • Improve FAP's reputation
    • To promote a coordinated community response
      • Promote command and investigative agencies' responsibilities
      • Refocus FAP on clinical work
    • To ensure respect for privacy rights
    • To improve efficiency

  18. The New CRC: CCSM + IDC
  • Clinical Case Staff Meeting ("CCSM")
    • Safety planning and action
    • FAP assessment and treatment planning
    • Occurs ASAP
  • Incident Determination Committee ("IDC")
    • Administrative decision: Does the incident meet criteria for entry into the Central Registry with personal identifiers?
  • FAP communicates the treatment plan to the unit commander

  19. The New CRC: CCSM + IDC
  • Incident Determination Committee ("IDC")
    • Chaired by a senior commander
    • Composition limited to those with relevant information for the determination
    • Precludes discussion of irrelevant information, especially information protected by privacy rights

  20. Incident Determination Committee: Joint-Service Criteria for the Central Registry
  • To be entered into the FAP Central Registry with personal identifiers, incidents must have (sketched below):
    • An act (or failure to act), plus
    • Harm (except for sexual abuse) meeting specific thresholds:
      • Actual injury
      • Reasonable potential for injury
      • Acute significant fear reaction
  • The criteria have demonstrated validity and reliability
  • Exclusions reduced by raising the harm threshold
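The entry rule described above can be read as a simple decision: an act (or failure to act) plus harm meeting one of the thresholds, with sexual abuse exempt from the harm requirement. The Python sketch below encodes that reading for discussion only; it is a simplification, not the official joint-service decision tool or its full criteria.

```python
# Simplified reading of the entry criteria; not the official decision algorithm.
HARM_THRESHOLDS = {"actual injury", "reasonable potential for injury",
                   "acute significant fear reaction"}

def meets_entry_criteria(act_or_omission: bool, harms: set[str],
                         is_sexual_abuse: bool) -> bool:
    """Return True if the incident would be entered into the Central Registry
    with personal identifiers under the simplified rule above."""
    if not act_or_omission:
        return False
    if is_sexual_abuse:                 # harm threshold not required
        return True
    return bool(harms & HARM_THRESHOLDS)

# Hypothetical example: an act with an acute significant fear reaction meets criteria.
print(meets_entry_criteria(True, {"acute significant fear reaction"}, False))  # True
```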

  21. The New Process
  • Enhances the command role
    • Unit commander pre-IDC action for victim safety
    • Unit commander pre-IDC administrative/disciplinary action, as appropriate
  • A higher-level commander chairing the IDC promotes:
    • Prompt attendance
    • Preparation
    • Focused attention
    • Protection of privacy rights

  22. Addressing Domestic Abuse
  • Increase OSD FAP funding for victim advocates
    • GAO study
  • Civilian advocates' issues
    • Legislation for an OSD Office of the Victim Advocate
  • Coordinated community response projects
    • Availability and training of law enforcement
    • Command priorities
    • Enhanced visibility of command actions

  23. Data Collection in the Reserve Component
  • How much family violence occurs in the Reserve Component?
  • Civilian child abuse data collection process is problematic
    • OSD is working with HHS on NCANDS
  • No civilian data collection system for domestic abuse
    • No public agency system
    • Can't require nonprofit agencies to collect data

  24. Addressing Family Violence in the Reserve Component
  • Expanding access to FAP prevention services
    • Web-based materials
    • Ad Council public awareness campaign
    • Civilian home-visiting programs
  • Intervention
    • Availability of resources
    • Restricted reporting
    • Line-of-duty issue

  25. Data Collection Involving Wounded, Ill & Injured (WII) Service Members
  • How much family violence occurs among WII service members?
  • FAP is exploring joint research, matching databases on:
    • FAP Central Registry
    • Deployment to hostile areas
    • Wounded/injured status
    • Mental health problems

  26. Addressing Family Violence in WII Service Members
  • Protocol for reporting family violence in Warrior Transition Units
  • Intervention/treatment challenges
    • Civilian advocates urge DoD to emphasize a criminal justice approach
    • Is this the right approach if domestic abuse arises after combat-operational stress and/or PTSD?
  • Coordination with the Veterans Administration

  27. Domestic Abuser Treatment for WII Personnel
  • Are State standards for "batterer treatment" ESP? (Low rates of success)
  • One possible reason: "one size fits all" group psycho-education by unlicensed professionals
    • Psycho-social assessments not incorporated
    • Doesn't address dual-diagnosis problems
    • No therapeutic alliance to change behavior
  • FAP is proposing a range of modalities delivered by licensed professionals
    • Individualized assessment
    • Motivation to change

  28. Treatment of Abusers with Depressive Disorders & PTSD
  • Treatment should address:
    • Anger and impulse control
    • Self-medication with alcohol
    • Low self-esteem
    • Controlling others to maintain a safe environment
    • Addiction to risk, especially in PTSD
    • Anxiety
  • Need to coordinate FAP treatment with:
    • Mental health treatment
    • Substance abuse treatment
