
Creating a Culture of Outcomes Management ( OMGmt ): Showing What Works






Presentation Transcript


  1. Creating a Culture of Outcomes Management (OMGmt): Showing What Works National Social Services and Disaster Management Conference March 26, 2014

  2. Presenters • Major Lewis R. Reckline, Area Commander, National Capital Area Command in Washington, DC • Leslye E. Wooley, Area Command Director of Program Services, Washington, DC

  3. Overview of the National Capital Area Command • Area Command was created in 2006 • Encompasses the Army’s programs in the following areas: • Two suburban counties in Maryland – Montgomery and Prince George’s County • Washington, DC • Northern Virginia – City of Alexandria, Arlington, Fairfax, and Prince William Counties

  4. Creation of the Area Command: The Strengths • Increased visibility of The Salvation Army across the region • Increased opportunities to expand programs and provide additional services across the entire region • Increased the viability of our fundraising • Increased the influence of a regional Advisory Board

  5. Creation of the Area Command: The Challenges • Increased scrutiny of the effectiveness of our services • Increased need for funding, especially new funding streams • Increased emphasis on reporting our program outcomes

  6. Have you or your staff ever looked like this at the mention of outcomes?

  7. Tell the truth… Have you or your staff ever… 1.) broken out into a cold sweat when a grant application asked for outcome information? 2.) laughed maniacally when a donor asked, “what is your success rate?”

  8. Tell the truth…continued 3.) looked blankly at your Development Department colleagues when they asked you for a logic model for your program? 4.) cursed an Advisory Board member who asked, “tell us about a ‘typical’ client”?

  9. Tell the truth, continued • If you answered yes to any of these questions, you are not alone. • This presentation is the Area Command’s story of how it came to see data as a friend, not an enemy.

  10. Recognizing the Importance of Outcome Measurement “Why did we need to create a culture of data collection and outcome measurement?” AND “Who cares about outcomes anyway?!”

  11. Why did the Area Command choose to focus on measuring outcomes? 1. Increased accountability/stewardship – are our programs the best that they can be? a. Are we Doing the Most Good for clients? With donor dollars? 2. To answer, “what is your success rate?” a. What gets measured gets improved. b. Data backs up our anecdotal evidence. c. The data validates the work that we do. d. The data shows us our areas of weakness.

  12. Why did the Area Command choose to focus on measuring outcomes? 3. Change in direction/emphasis by funders (example: homeless services). 4. To make informed policy decisions and program changes. a. If the program is not working, what changes need to be made? Can it be fixed or… b. Does the program need to end?

  13. Why did the Area Command choose to focus on measuring outcomes? 5. To inform stakeholders of our successes. a. Donors – leads to increased donations; b. Funders – leads to increased funding streams; c. Staff – shows that what they are doing is having a positive impact on program participants; d. Program Participants – increased buy-in because “this is a good program that can really help me.”

  14. Who cares about outcomes anyway? 1. Funders – nearly every grant application requires information about outcomes from the programs that are applying. Funders then award funding to programs with proven results – “more bang for the buck.” 2. Donors – major and smaller donors alike want to know the outcomes of our programs. They want to know The Salvation Army really is “doing the most good” with their donations.

  15. Who cares about outcomes anyway? Continued 3. Policy makers – want to know which interventions work best in order to replicate successful programs, end unsuccessful programs, and (hopefully) increase the funding for those with the strongest outcomes. 4. Stakeholders – including program participants themselves, Advisory Board members, etc. 5. Ourselves/Staff – does our program do what we want/need it to do? Did our intervention produce change? (And, if yes, was it the change we intended?)

  16. Outcomes Measurement 101 • Pre-Test • What was our existing culture around data collection and outcomes measurement at the outset? • Intervention • Creating a dedicated staff position for outcomes measurement and outcomes management. • Post-Test • Reviewing the outcomes, disseminating the outcomes, and making program changes to improve the outcomes.

  17. The Pre-Test: Assessing Our Existing Culture Around Outcome Measurement (or Lack Thereof)

  18. The Pre-Test: Assessing Our Existing Culture Around Outcome Measurement (or Lack Thereof) Early Attempts – 2010-11 1. Emergency Rental and Utility Assistance 2. Angel Tree 3. Turning Point Center

  19. The Pre-Test: Emergency Assistance

  20. The Pre-Test: Emergency Assistance • Providing emergency financial assistance for rent, mortgage, and utility arrearages to prevent greater crises such as homelessness. • What happened to the client after we provided financial assistance? WE HAD NO IDEA.

  21. The Pre-Test: Assessing Our Existing Culture Around Outcome Measurement (or Lack Thereof), cont.

  22. The Pre-Test: Angel Tree What did we want to know? 1. What are the characteristics of the “typical” family who is registering for Christmas assistance? 2. How can we use information about the families to market the program and increase the number of (corporate) donors who sponsor angels at Christmas?

  23. The Pre-Test: Angel Tree, cont. What data and reports did we have already? Using a common database, we could pull reports for: 1. Number of clients per zip code 2. Number of children by age and gender Doesn’t paint much of a picture…

  24. The Pre-Test: Turning Point Center

  25. The Pre-Test: Turning Point Center The Turning Point Center is a two-year transitional housing program for homeless young women, between the ages of 18 and 30, with up to four children. Families may stay at TP for up to twenty-four months.

  26. The Pre-Test: Turning Point Center • Changes in emphasis within homeless services forced our hand • Reduction/elimination of funding for transitional housing programs • Increased funding/emphasis on rapid re-housing and permanent supportive housing programs How could we use outcomes to strengthen our position as a TH provider in a PSH/RRH world?

  27. The Pre-Test: Turning Point Center • Needed to be able to show why our transitional program was relevant and necessary • Serve a population (young women with children) that needs transitional housing because of its many barriers to maintaining permanent housing • Show success by reducing lengths of stay, increasing numbers served, and improving exit destinations and outcomes.

  28. The Pre-Test: Turning Point Center THE DREAM (2010) Proposed outcomes (based on former HUD SHP outcomes), i.e., what we were putting in our grant proposals: 1. Increased education and income 2. Exit to permanent housing 3. Increased self-determination

  29. The Pre-Test: Turning Point Center THE REALITY First Attempt: Using the data we knew we could get quickly, we asked the following: a. Type of Exit – how did our clients leave? b. Exit Destination – where did they go?

  30. Turning Point Outcomes Data – First Attempt 2010

  31. The Pre-Test – Turning Point Center The Results (reported to TP staff and the newly created Advisory Board Program Committee): • 50% left for positive reasons • 50% left for negative reasons • Lengths of stay ranged from 1 month to 46 months. Average stay – 14 months. • Exit destinations were unclear; there was no uniform definition of exit destinations. Honesty time…these were not good.
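Results like these are simple tallies over exit records. A minimal sketch, using made-up records (the percentages and stay lengths below are illustrative, not the Turning Point Center's actual data):

```python
from statistics import mean

# Hypothetical exit records; "positive" flags and stay lengths (in
# months) are illustrative, not real program data.
exits = [
    {"positive": True,  "months": 14},
    {"positive": False, "months": 1},
    {"positive": True,  "months": 46},
    {"positive": False, "months": 9},
]

# Share of clients who left for positive reasons.
pct_positive = 100 * sum(e["positive"] for e in exits) / len(exits)

# Range and average of lengths of stay.
stays = [e["months"] for e in exits]

print(f"Positive exits: {pct_positive:.0f}%")
print(f"Stay: {min(stays)}-{max(stays)} months, avg {mean(stays):.1f}")
```

The key lesson from the slide still applies: these numbers are only as good as the exit definitions behind them, which is why standardized exit codes came later.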

  32. The Pre-Test – Turning Point Center Additional Questions We Wanted Answered: 1. How long were families staying (it was supposed to be a twenty-four-month program)? 2. Were heads of household (HoHs) employed before, during, and after the program? 3. Were household incomes increasing? 4. Which Independent Living Skills (ILS) classes were most effective? 5. Were education levels changing?

  33. Conclusion of the Pre-Test: We had the will to measure outcomes, but not the means.

  34. The Intervention: Creating the Culture What did we need to do to create a culture of data collection and outcome measurement?

  35. The Intervention: Creating the Culture

  36. The Intervention: Creating the Culture, Continued

  37. Intervention: Our First 18 Months What was our Program Outcomes Coordinator able to accomplish? (WHAT?) What did the data tell us / what outcomes did we have? (SO WHAT?) What did we do with the information once we had it? (NOW WHAT?)

  38. Intervention: What did our Program Outcomes Coordinator accomplish? 1. Benchmarking

  39. Intervention: What did our Program Outcomes Coordinator accomplish? 2. Defined who the “typical” client is using client data (vs. anecdotal stories). Example: Typical Angel Tree Family • Average Number of Children - 2.5 • Majority of Heads of Household are Single Parents – 63.41% • Average Monthly Income - $1,135 ($13,620/yr)
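Benchmarks like these fall straight out of the registration records. A minimal sketch, assuming hypothetical field names (the real figures came from the shared client database, not this code):

```python
# Hypothetical Angel Tree registration records; field names are
# illustrative, not the actual database schema.
families = [
    {"children": 3, "single_parent": True,  "monthly_income": 1050},
    {"children": 2, "single_parent": True,  "monthly_income": 1200},
    {"children": 2, "single_parent": False, "monthly_income": 1400},
    {"children": 3, "single_parent": True,  "monthly_income": 900},
]

n = len(families)
avg_children = sum(f["children"] for f in families) / n
pct_single = 100 * sum(f["single_parent"] for f in families) / n
avg_income = sum(f["monthly_income"] for f in families) / n

print(f"Average children per family: {avg_children:.1f}")
print(f"Single-parent households: {pct_single:.2f}%")
print(f"Average monthly income: ${avg_income:,.0f} (${avg_income * 12:,.0f}/yr)")
```

The point of the exercise is that a data-backed "typical family" replaces anecdote when marketing the program to corporate sponsors.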

  40. Intervention: What did our Program Outcomes Coordinator accomplish? 3. Developed logic models for each program.

  41. Intervention: What did our Program Outcomes Coordinator accomplish? 4. Established outcome measures per program. • Emergency Assistance • 30-, 60-, and 90-day follow-up asked two questions: 1. Are you still housed, or is your utility still on? 2. Is your rental or utility balance current?

  42. Intervention: What did our Program Outcomes Coordinator accomplish? 4. Established outcome measures per program. • Turning Point Center • Reason for Exit • Destination at Exit • Changes in: • income • education • employment • housing barrier assessment • Self-Sufficiency Matrix score • Independent Living Skills mastery

  43. Intervention: What did our Program Outcomes Coordinator accomplish? 5. Developed tools to measure those outcomes. A. Emergency Assistance 30/60/90-day follow-up – an online tool; staff click on answers as they speak with clients, and the Coordinator can view results immediately
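The follow-up tool boils down to recording two yes/no answers per client at each interval and summarizing them. A minimal sketch of that data shape, with hypothetical names (the actual tool was an online form, not this code):

```python
from dataclasses import dataclass, field

# Hypothetical model of the 30/60/90-day follow-up; names are
# illustrative, not the Area Command's actual tool.
@dataclass
class FollowUp:
    day: int               # 30, 60, or 90
    still_housed: bool     # "Are you still housed / is your utility still on?"
    balance_current: bool  # "Is your rental or utility balance current?"

@dataclass
class Case:
    client_id: str
    followups: list = field(default_factory=list)

def record(case, day, still_housed, balance_current):
    """Staff click answers while on the phone; this stores one response."""
    case.followups.append(FollowUp(day, still_housed, balance_current))

def housed_rate(cases, day):
    """Share of clients still housed at a given follow-up interval."""
    answers = [f.still_housed for c in cases for f in c.followups if f.day == day]
    return sum(answers) / len(answers) if answers else None

cases = [Case("A-101"), Case("A-102")]
record(cases[0], 30, True, True)
record(cases[1], 30, True, False)
record(cases[0], 90, True, True)
print(f"Housed at 30 days: {housed_rate(cases, 30):.0%}")
```

Because results accumulate as calls happen, the Coordinator can review the housed-rate at any moment rather than waiting for a reporting cycle.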

  44. Intervention: What did our Program Outcomes Coordinator accomplish? 5. Developed tools to measure those outcomes. B. Angel Tree Zip code analysis overlaid with the percentage of residents living at or below the Federal Poverty Level (FPL) in each zip code.
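The overlay is conceptually a join: client counts per zip code matched against poverty rates for the same zip codes. A minimal sketch with made-up numbers (real poverty figures would come from Census data, and the real analysis was a map, not a table):

```python
# Hypothetical data: Angel Tree client counts per zip code, joined
# with the share of residents at or below the Federal Poverty Level.
# All numbers are illustrative.
clients_per_zip = {"20001": 180, "20019": 240, "22304": 95}
pct_below_fpl = {"20001": 18.2, "20019": 27.5, "22304": 12.1}

# Join the two datasets and sort by poverty rate, highest first.
overlay = sorted(
    ((z, clients_per_zip[z], pct_below_fpl.get(z)) for z in clients_per_zip),
    key=lambda row: row[2] or 0,
    reverse=True,
)
for zip_code, n_clients, pct in overlay:
    print(f"{zip_code}: {n_clients} clients, {pct}% at/below FPL")
```

Seeing where registrations do and don't track local poverty rates is what makes the overlay useful for targeting outreach and marketing the program to sponsors.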

  45. Zip Code Analysis Overlay with FPL

  46. Zip Code Analysis Overlay with FPL

  47. Intervention: What did our Program Outcomes Coordinator accomplish? 5. Developed tools to measure those outcomes. C. Turning Point Center • Developed entry/exit tools for Turning Point to track changes in income, employment, Self-Sufficiency Matrix scores, ILS scores, Out of Poverty pre- and post-test, etc. • Developed standardized definitions for reasons for exit and exit destinations.
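The entry/exit tools work by capturing the same measures at intake and at exit, so the outcome is simply the change between the two. A minimal sketch, with hypothetical field names and scores (the real forms track more measures than shown here):

```python
# Hypothetical entry/exit snapshots for one Turning Point family;
# field names and values are illustrative only.
entry = {"monthly_income": 400, "employed": False, "ssm_score": 28}
exit_ = {"monthly_income": 1350, "employed": True, "ssm_score": 41}

# Pair up entry and exit values for every tracked measure.
changes = {k: (entry[k], exit_[k]) for k in entry}

income_change = exit_["monthly_income"] - entry["monthly_income"]
print(f"Income change: +${income_change}")
print(f"Self-Sufficiency Matrix: {entry['ssm_score']} -> {exit_['ssm_score']}")
```

Standardized exit definitions matter for the same reason: without them, "change" can't be aggregated consistently across families.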

  48. The Tools • Examples of • Entry/Exit Form • Reason for Exit/Exit Destination Codes

  49. Post-Test: Now What? 1. Reviewing the outcomes 2. Disseminating the outcomes 3. Making program changes based on the outcomes in order to improve them

  50. Reviewing Outcomes: Emergency Assistance “Is there any other type of assistance or services your family needs at this time?” • Most Common Responses • Need legal assistance • Landlord that won’t repair broken items • Medical bills/affordable health services • Usually also resulted from job loss
