
Presentation to the CCEH Annual Training Institute 5/8/14


Presentation Transcript


  1. Building a Sustainable Coordinated Access System Through the Use of Best Practices and Critical Data Elements Presentation to the CCEH Annual Training Institute 5/8/14

  2. Coordinated Access Design a coordinated system of triage, assessment, and entry that effectively moves people out of homelessness quickly by providing increased access to appropriate housing interventions.

  3. Connecticut Coordinated Access Network (CAN) Template A client/family in a housing crisis calls 2-1-1.
  1. First effort: shelter diversion. If the client's needs cannot be met with non-housing resources, complete a basic screen and refer the client for CAN intake.
  2. If diversion is not possible, complete an assessment (VI-SPDAT or similar) and refer to shelter.
  3. If shelter is needed but not available, refer to the HOT team.
  4. The shelter/HOT team refers the client to the appropriate housing resource (RRH, TH, or PSH). If it is a DV case, refer to the DV provider.

  4. “One of the great mistakes is to judge policies and programs by their intentions rather than their results” Milton Friedman

  5. During This Workshop We Will:
  • Discuss principles of evaluation research to document coordinated access activities and outcomes
  • Learn how to utilize data to inform systems change
  • Examine preliminary process and outcome data from coordinated access efforts in CT
  • Introduce tools used to evaluate the effectiveness of coordinated access systems
  • Take a preliminary look at what our new HMIS can do to help us track and evaluate our coordinated access efforts

  6. 4 Types of Program Evaluation
  • Process evaluation - assesses whether a program or system is operating as intended
  • Outcome evaluation - assesses whether a program is achieving the goals it set out to achieve
  • Impact evaluation - assesses program outcomes against what would have happened if the program were not in place (e.g., a randomized controlled trial)
  • Cost-benefit analysis - identifies all relevant program costs and benefits (in financial terms) to determine whether the program is cost effective

  7. Evaluation Cycle

  8. Planning the Evaluation
  Who should be involved?
  • Housing providers (leaders and front-line staff)
  • HMIS lead
  • CCEH
  • 2-1-1
  • Health care providers
  • Behavioral health care providers
  • Representatives from the school system
  • Consumers
  • Others?
  Other considerations:
  • Who should manage and oversee the evaluation process?
  • How can we get program buy-in?
  • Who is responsible for analyzing and reporting out on the data?
  • How are results funneled back to the system?

  9. Identifying Our Research Questions for Coordinated Access
  What does a successful coordinated access system look like?
  • How many people will access the system?
  • How many will be diverted from shelter?
  • Number referred for shelter
  • Number enrolled in shelter
  • Number who access the system multiple times in a specified timeframe
  How will clients benefit?
  • Shorter time to shelter entry
  • Declining rates of homelessness
  • Shorter periods of time homeless
  • Less chronic homelessness
  • Percentage of those diverted who come back into the system
  How will agencies benefit (efficiency, collaboration)?
  • How does coordinated access affect staff time spent answering phones, conducting intakes, etc.?
  • How well are agencies collaborating?
  Results-Based Accountability: research questions should address the following domains: How much? How well? Is anyone better off?

  10. Choosing performance indicators and critical data elements
  • How do we operationalize our research questions?
  • Clear and consistent definitions (e.g., what counts as an episode of homelessness, categories for exit destination, chronic homelessness)
  • Assess what data we already collect through HMIS
  • HEARTH indicators:
  • Shelter length of stay
  • New entries into homelessness
  • Repeat episodes of homelessness
  • Job and income growth
  • VI-SPDAT or other universal assessment tool
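As a rough sketch of how HEARTH-style indicators such as shelter length of stay and repeat episodes could be computed from HMIS data, consider the following. The record layout here is a hypothetical simplification (a list of client ID, entry date, exit date tuples), not an actual HMIS export format:

```python
from collections import Counter
from datetime import date

# Hypothetical shelter-stay records: (client_id, entry_date, exit_date).
stays = [
    ("A", date(2013, 1, 5), date(2013, 2, 20)),
    ("A", date(2013, 9, 1), date(2013, 10, 1)),
    ("B", date(2013, 3, 10), date(2013, 4, 1)),
]

# Length of stay (in days) for each shelter episode.
lengths = [(exited - entered).days for _, entered, exited in stays]
avg_length = sum(lengths) / len(lengths)

# Repeat episodes: clients with more than one stay in the period.
episodes = Counter(client_id for client_id, _, _ in stays)
repeaters = [client_id for client_id, n in episodes.items() if n > 1]

print(avg_length)   # average shelter length of stay in days
print(repeaters)    # clients who returned to shelter
```

A real calculation would also need the consistent definitions the slide calls for (e.g., whether back-to-back stays count as one episode or two).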

  11. Choosing performance indicators and critical data elements
  • What additional data should we collect?
  • How do we gather enough data to show meaningful outcomes, but not so much that we waste client and staff time?
  • What data are realistic to collect given our timeframe and resources?
  • Consumer surveys
  • Can qualitative data be collected and analyzed?
  • Interviews, focus groups, observations
  • This process can be very labor- and cost-intensive
  • How will data collection be coordinated?

  12. Collecting valid and reliable data
  • Data quality
  • Train, train, train
  • Ongoing monitoring
  • What if what we're looking to collect is not in HMIS?
  • Use of common databases/forms
  • HMIS

  13. ECM Reporting - SUR

  14. ECM Reporting
  Other ECM reports:
  • Data Quality
  • Data Timeliness
  • APR
  • CAPER
  • PATH
  • SSVF
  • HOPWA
  • Services by Program
  • Clients Served

  15. Reporting - VITALS
  VITALS (Valuable Information to Assess Local Systems)
  • Vision: dashboard-style report
  • Relevant HEARTH measures
  • Web-based reporting portal
  • Aggregated information for programs and communities
  • Dynamic, drill-down capability
  • Level of analysis:
  • Program
  • Community (CoC / sub-CoC)
  • State
  • Currently under development

  16. Analyze data
  • Are there any comparison data available?
  • The same system before coordinated access was implemented (PIT count)
  • A different, but similar, community
  • What timelines are data pulled for?
  • Keep in mind that some outcomes are short term and some are long term
  • For process and outcome measures, break out results by subgroup to identify any trends:
  • Gender
  • Race
  • Geographical area
  • Age
  • Household type (single vs. family)
  • Chronic homelessness
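The subgroup breakout described above amounts to computing an outcome rate per group. A minimal sketch, using made-up exit records tagged with household type and whether the exit was positive (the field names are illustrative, not actual HMIS elements):

```python
from collections import defaultdict

# Hypothetical exit records: (household_type, positive_exit).
exits = [
    ("single", True), ("single", False), ("single", True),
    ("family", True), ("family", True),
]

# Tally totals and positive exits per subgroup.
totals = defaultdict(int)
positives = defaultdict(int)
for group, positive in exits:
    totals[group] += 1
    positives[group] += positive

# Positive-exit rate broken out by household type.
rates = {group: positives[group] / totals[group] for group in totals}
print(rates)
```

The same pattern applies to any of the subgroups listed on the slide (gender, race, age, geography): swap the grouping key and re-tally.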

  17. Communicate findings to stakeholders
  • Communication should be ongoing and involve staff at all levels of the program or system
  • Preliminary findings should be discussed along the way, not just at the end (this allows changes to be made if things are not going in the desired direction)
  • All findings should be discussed - both positive and negative, intended and unintended
  • Limitations of the data must be discussed so that everyone is clear on what the data do and do not tell us
  • Create a system by which program changes can or will be made based on evaluation results
  • Discuss which results are presented to the community at large

  18. What Have Data on Coordinated Access Efforts in CT Shown Us So Far?

  19. New London Coordinated Access: Families
  • The coordinated access system has been in place for about 2.5 years
  • 2-1-1 is the primary point of contact
  Process data:
  • In 2013, 948 calls to 2-1-1
  • 54% were diverted from shelter
  • 46% were scheduled for shelter appointments
  • About a 30% no-show rate
  • 163 were diverted at the time of shelter intake
  Outcome data:
  • 50% reduction in family shelter units/beds since 2011
  • In 2013, 44 families were rapidly rehoused
  • Average length of stay in shelter was 45.5 days

  20. New London Coordinated Access: Singles
  • The system for single adults started in November 2013, with 2-1-1 as the first, centralized contact
  Process measures:
  • An average of 20 new intakes each week
  • Diversion rates of 16% in New London and 46% in Norwich
  • Of the 743 calls that came in between 7/1/13 and 4/30/14: 66% enrolled in shelter, 16% were diverted, 12% were waitlisted, 4% were admitted to Covenant, and 1% had no need for shelter
  • Shelter length of stay: 60% stayed 30 days or less
  Preliminary outcome data:
  • 47% of shelter exits were positive (permanent housing, treatment, or another stable housing situation)

  21. Bridgeport Preliminary Data • This provides an example of the way data can be used to track population need over a period of several years.

  22. Questions?

  23. Additional Resources
  What Gets Measured, Gets Done: A Toolkit on Performance Measurement for Ending Homelessness - Abt Associates
  http://www.endhomelessness.org/library/entry/what-gets-measured-gets-done-a-toolkit-on-performance-measurement-for-
  Coordinated Assessment Toolkit: Evaluation - National Alliance to End Homelessness
  http://www.endhomelessness.org/library/entry/coordinated-assessment-toolkit-evaluation

  24. Presenter Contact Information
  Meredith Damboise, Director of Quality Assurance, New Haven Home Recovery - mdamboise@nhr.org
  Kelley Traister, Quality Assurance and Compliance Specialist, New Haven Home Recovery - ktraister@nhhr.org
  Brian Roccapriore, Director of HMIS and Strategic Analysis, Connecticut Coalition to End Homelessness - broccapriore@cceh.org
