
The Challenges of Evaluation in Times of Budget Cuts

This paper discusses the evaluation of state substance abuse systems in Virginia, specifically the State Performance Outcome Measurement System (POMS) and the Substance Abuse Client Automated Data System (SCADS). It explores the objectives, data collection process, and products of these systems, as well as the reasons for their failure. The paper highlights the impact of budget cuts and the challenges faced in implementing and maintaining evaluation systems during times of financial constraints.





Presentation Transcript


  1. TO CONTINUE OR NOT TO CONTINUE EVALUATION IN TIMES OF BUDGET CUTS? Minakshi Tikoo, Ph.D. Virginia Department of Mental Health, Mental Retardation, and Substance Abuse Services Canadian Evaluation Society

  2. History of Evaluation of State SA Systems in Virginia • 1992 - Statewide Client Automated Data System (SCADS) – Expanded version of the Treatment Episode Data Set (TEDS Plus) • 1996 - State Performance Outcome Measurement System (POMS) process (2000-2002) • 1998-2002 - Treatment Outcomes and Performance Pilot Studies II (TOPPS II) • 2002 - SCADS Plus or TEDS Plus

  3. Overview of the CSB System • 40 CSBs and 15 state facilities • The central office (CO) does not directly operate the CSBs • 8 different management information systems (MIS) are used by the CSBs • CMHC (18) Medical Manager (2) • Anasazi (7) Wil Data System (2) • BTI (6) In-house (1) • CSM (3) Psych Consult (1)

  4. POMS Primary Objectives • The Virginia POMS was intended to serve two major purposes: • To function as a tool for improving the quality of services for persons with mental disabilities and substance abuse problems (with mental retardation and SA prevention issues planned for subsequent phases), and • To improve accountability for taxpayer dollars.

  5. Data Collection for POMS • Criteria for selection • Classification as a member of the priority population • Standardized instruments • Composite score questions from the ASI • MCAS • CAFAS • Outcome data elements (8 face-valid indicators) • Data was collected at intake, at discharge, and at six-month intervals

  6. Data Collection Process • Data was collected by administrative staff or a clinician (a CSB decision) • Data was collected mostly on paper • We received a monthly extract from the local MIS • Each CSB was provided $40,000 for POMS • May not have been the best decision?

  7. POMS Sample 10/00 to 6/02

  8. Sample between 10/00 to 6/02

  9. Data Processing • The Research and Evaluation Unit created and managed the centralized data • Each region was assigned a staff person who identified errors in the database; ultimately, each CSB was responsible for cleaning its own data

  10. POMS Products • Monthly error reports generated at the department and emailed to the 40 CSBs • An annual report • Starting in 2002, quarterly reports • The SA division sent out additional reports on outcomes by demographic variables, with comparative data for each region and the state • Data is still being reported and disseminated via a newsletter that was started in May 2003

  11. SCADS Objectives • To meet the requirements of the Substance Abuse Prevention and Treatment Block Grant • To enable state systems to report on performance outcome measures • To provide data that can be used to improve state plans addressing quality of services

  12. SCADS/TEDS: Data Collection • Data collected since 1992 • Data is submitted electronically on a quarterly basis • Oct 2001: attention was first paid to the dataset • Oct 2002: all CSBs were reporting data, though data quality was poor • January 2003: started sending reports back to the CSBs

  13. SCADS/TEDS • April 2003: hired a programmer to code the data and automate report generation • June 2003: the second set of quarterly reports will be sent to the CSBs in a timely fashion for the first time in a decade • June 2003: CSBs with poor-quality data must submit a corrective action plan that identifies a timeline

  14. SCADS Products • Quarterly error reports generated at the department and emailed to the 40 CSBs • Calculation of performance measurement indicators on a voluntary basis for Block Grant funds

  15. SCADS Submission

  16. SCADS Submission

  17. Why POMS Failed • The anxiety associated with the system was paramount; our process showed signs of ‘excessive anxiety’ (Donaldson, Gooler, & Scriven, 2002) • Conflict (hidden agendas) • Withdrawal (refusing to work with evaluators) • Resistance (stalling, protesting) • Shame (?) • Anger (once the budget was cut)

  18. Why POMS Failed • We tried some of the methods recommended for reducing anxiety, but it was “too little, too late” • Providers had different levels of ‘local capacity and commitment’ (Fredericks, Carman, & Birkland, 2002) • Multiple stakeholders • Fears about cross-site comparisons

  19. Why POMS Failed • Too many items • Past CSB experiences with data submission had not resulted in any formal or informal feedback, and that perception was very difficult to change • POMS was built as a new system and did not integrate some of the existing data collection initiatives

  20. Why POMS Failed • Data collection was always perceived as a burden, though discontinuing POMS reduced that burden by only 5 elements • The POMS committee should have worked in conjunction with another committee that had been formed to cut the department’s data requests in half • Currently, 25 different datasets are used for data submission • The total number of elements exceeds 800

  21. Why SCADS Will Succeed • It has a limited number of data elements • Data is collected at two points: admission and discharge • The vendors maintain it, and any required change is either free or relatively inexpensive • The OSAS has invested a lot of energy in rejuvenating an existing system

  22. Why SCADS Will Succeed • It continues to be a federal requirement • CSBs are getting regular reports and feedback • So far, 11 of the 40 CSBs have been visited

  23. Contact Information Minakshi Tikoo, Ph.D. Manager, Research and Evaluation Office of Substance Abuse Services VA DMHMRSAS 1220 Bank Street Richmond, VA 23218 804-225-3394 (phone) 804-786-9248 (fax) mtikoo@dmhmrsas.state.va.us
