
Data, Data – Where is my Data?! Utilizing a Data Warehouse


Presentation Transcript


  1. Data, Data – Where is my Data?! Utilizing a Data Warehouse • Presenters: • Jeff Cohen, Vermont IV-D Director • Jeanette Savoy, Central Registry Supervisor, CDHS, CSE • Keith Horton, Georgia IV-D Director • May 23, 2011, 1:30 – 3:00 p.m.

  2. Decision Support Systems: • Using Data to Improve Child Support Program Performance Jeff Cohen Vermont Office of Child Support

  3. Agenda Decision Support for Performance • Strategic Value • Decision Support Components • Reporting • Dashboards • Data Mining • Predictive Analytics • OCSE model data warehouse

  4. Performance Drivers: • Incentive Formula • Self Assessment • Strategic Plans • State Legislature • State Performance Audits • Public Expectations

  5. Building Blocks for Excellence (framework diagram): Leadership, Strategic Plan, Customer Focus, Human Resources, Process Management, and Business Results, all supported by Information and Analysis

  6. Objective: Management by fact • ‘What gets measured gets done’

  7. Appropriate Data Representation by Function

  8. Decision Support System Data Warehouse – DSS Architecture (diagram): Data Sources (CS case data, welfare, DOC, and other data sources) feed Source Data Extraction and ETL (Extract, Transform, Load into a data staging area); data is stored in the Data Warehouse (metadata, summary, and detail) and in Data Marts; Presentation/OLAP Services then support End User Access through end-user applications, report writers, ad hoc query, data mining, and forecasting & projections tools
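A minimal sketch of the extract-transform-load step in the architecture above, assuming a hypothetical source export file (cs_case.csv) and a local SQLite database standing in for the warehouse; the file, table, and column names are illustrative, not the actual statewide system schema.

```python
import csv
import sqlite3

# Hypothetical example: extract case records from a source-system export,
# apply a simple transformation, and load them into a warehouse table.
def extract(path):
    # Assumes a CSV export from the source system exists at `path`.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    # Standardize fields before loading (illustrative rules only).
    return (
        row["case_id"].strip(),
        row["county"].title(),
        float(row["current_support_owed"] or 0),
        float(row["current_support_paid"] or 0),
    )

def load(rows, db_path="warehouse.db"):
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS fact_case (
        case_id TEXT, county TEXT,
        current_support_owed REAL, current_support_paid REAL)""")
    con.executemany("INSERT INTO fact_case VALUES (?, ?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(r) for r in extract("cs_case.csv"))
```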

  9. Example Star Schema: a central fact table (Current $, Arrears $, # of Cases) joined to four dimensions: Time (Fiscal Year, Quarter, Month, Day), Demographic (Income, Gender, Location, Education), Ownership (Region, County, Worker, Case #), and Case Status (IV-D Type, Status Type)
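A small sketch of how the star schema above could be declared and queried, using SQLite through Python; the table and column names simply mirror the slide and are illustrative.

```python
import sqlite3

# Illustrative star schema matching the slide: one fact table keyed to
# time, demographic, ownership, and case-status dimensions.
ddl = """
CREATE TABLE dim_time        (time_key INTEGER PRIMARY KEY, fiscal_year INT, quarter INT, month INT, day INT);
CREATE TABLE dim_demographic (demo_key INTEGER PRIMARY KEY, income_band TEXT, gender TEXT, location TEXT, education TEXT);
CREATE TABLE dim_ownership   (own_key INTEGER PRIMARY KEY, region TEXT, county TEXT, worker TEXT, case_number TEXT);
CREATE TABLE dim_case_status (status_key INTEGER PRIMARY KEY, ivd_type TEXT, status_type TEXT);
CREATE TABLE fact_support (
    time_key   INT REFERENCES dim_time,
    demo_key   INT REFERENCES dim_demographic,
    own_key    INT REFERENCES dim_ownership,
    status_key INT REFERENCES dim_case_status,
    current_dollars REAL,
    arrears_dollars REAL,
    case_count INT
);
"""

con = sqlite3.connect(":memory:")
con.executescript(ddl)

# Typical star-schema query: roll current collections up by fiscal year and region.
query = """
SELECT t.fiscal_year, o.region, SUM(f.current_dollars)
FROM fact_support f
JOIN dim_time t ON f.time_key = t.time_key
JOIN dim_ownership o ON f.own_key = o.own_key
GROUP BY t.fiscal_year, o.region;
"""
print(con.execute(query).fetchall())
```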

  10. Some DSS Uses and Demos • Dashboard • Reports • Drilling • Data Mining • Predictive analytics • Linking

  11. Demos Here

  12. Screenshots

  13. Statewide Strategic_Plan_Parentage_Establishment_for_Region_and_Worker_1195.pdf

  14. Data Mining – The Problem / The Objective (scatter-plot illustration with Height and Weight as the two axes)
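A minimal sketch of the kind of two-feature classification problem the slide illustrates, assuming scikit-learn is available; the height/weight data points and the choice of logistic regression are purely illustrative.

```python
from sklearn.linear_model import LogisticRegression

# Toy data: (height_cm, weight_kg) pairs with a binary target label,
# standing in for whatever outcome the program wants to predict.
X = [[160, 55], [165, 60], [170, 65], [175, 80], [180, 85], [185, 95]]
y = [0, 0, 0, 1, 1, 1]

model = LogisticRegression().fit(X, y)

# Score a new case: probability of belonging to the target class.
print(model.predict_proba([[172, 70]])[0][1])
```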

  15. Lift: Random List vs. Model-ranked List (chart: cumulative % hits by % of cases reviewed). With a random list, reviewing 5% of cases finds 5% of targets…

  16. Lift: Random List vs. Model-ranked List (same chart, continued). With a random list, reviewing 5% of cases finds 5% of targets, but reviewing 5% of the model-ranked list finds 21% of targets. Lift at 5% of the list reviewed = 21% / 5% = 4.2; in other words, the model-ranked list is 4.2 times better than random.
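A short sketch of the lift calculation described above, assuming each case carries a model score and a flag for whether it turned out to be a target; the data layout and numbers are illustrative.

```python
import random

def lift_at(cases, fraction=0.05):
    """Lift of a model-ranked list over a random list at a given review depth.

    `cases` is a list of (score, is_target) pairs; a random list is expected
    to find `fraction` of the targets after reviewing `fraction` of the cases.
    """
    ranked = sorted(cases, key=lambda c: c[0], reverse=True)
    reviewed = ranked[: max(1, int(len(ranked) * fraction))]
    total_targets = sum(t for _, t in cases)
    hits = sum(t for _, t in reviewed)
    cumulative_hit_rate = hits / total_targets   # e.g. 0.21 on the slide
    return cumulative_hit_rate / fraction        # e.g. 0.21 / 0.05 = 4.2

# Tiny illustration: 100 cases, 20 targets concentrated among the higher scores.
random.seed(0)
cases = [(random.random() + (0.5 if i < 20 else 0.0), 1 if i < 20 else 0)
         for i in range(100)]
print(lift_at(cases, 0.05))
```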

  17. Colorado's Experience: • Business Intelligence Grant • Performance Dashboard • Presented by: Jeanette Savoy, Supervisor of the Colorado Central Registry

  18. Purpose • Synergistic relationship between compliance and performance • Initiate system caseload analysis capability using business intelligence tools • Replace monthly exception-based reports • Improve individual and county caseload performance

  19. Emphasis • Accurate representation of information • Clear understanding by CSE workers • Ability to drill down to case level to specify actions needed

  20. Project Goal The CSe-Tools Performance Dashboard will give CSE staff the tools to view caseload health and identify actions to help improve caseload management and program effectiveness, as measured by the four key performance indicators.

  21. Dashboard Design Principles • Keep it simple • Provide information quickly and clearly • Minimize distractions and unnecessary embellishments that can create confusion • Maintain consistency with the design to ensure accurate interpretation

  22. Project Development • Business Intelligence Workgroup • County, State and Federal representation • Review proposed solutions • Provide input on specific functionality • Elicit support, participation and cooperation • Project Development Team • Small group of programmers • Development of both data warehouse and performance dashboard

  23. Data Warehouse (Closet) • Provides appropriate information for the dashboard without overloading the main production database • “Warehouse” cost prohibitive • Initial “closet” to be expanded in incremental steps

  24. CSe-Tools • Browser-based application toolkit • Front-end application for statewide system • Interfaces with statewide system using web services and file transfers • Search and reporting capabilities • Drill down capabilities to case and financial detail information from the statewide system

  25. Performance Dashboard • Prominently displayed in the middle of the CSe-Tools homepage • Initial and immediate portrayal of caseload health on a single screen • Visual display of prioritized information • Ability to drill down to a list of cases that require action

  26. Performance Dashboard (cont.) • Specific to the following CSE Key Performance Indicators (KPIs): • Paternity Establishment Percentage • Percent of Cases with Support Orders • Percent of Current Support Paid • Percent of Arrears Cases with a Payment • KPI = a quantifiable measurement that reflects an organization's critical success factors
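A simplified sketch of how the four KPIs above might be computed over a worker's caseload; the field names are hypothetical, and the official federal incentive-measure definitions include rules not reflected here.

```python
# Simplified KPI calculations over a list of case dictionaries (field names
# are illustrative, not the statewide system's actual data elements).
def kpis(cases):
    with_order = sum(1 for c in cases if c["has_support_order"])
    owed = sum(c["current_support_owed"] for c in cases)
    paid = sum(c["current_support_paid"] for c in cases)
    arrears_cases = [c for c in cases if c["arrears_owed"] > 0]
    kids_needing = sum(c["children_born_out_of_wedlock"] for c in cases)
    kids_established = sum(c["children_paternity_established"] for c in cases)
    return {
        "paternity_establishment_pct": kids_established / max(kids_needing, 1),
        "pct_cases_with_support_orders": with_order / max(len(cases), 1),
        "pct_current_support_paid": paid / max(owed, 1),
        "pct_arrears_cases_with_payment":
            sum(1 for c in arrears_cases if c["arrears_paid"] > 0)
            / max(len(arrears_cases), 1),
    }
```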

  27. Dashboard Format

  28. Work Lists • Based on logic of KPIs • Redesigned after implementation based on feedback from grant participants • Use of “tags” (colored dots) to identify a set of criteria indicating the type of action that may be needed

  29. “Tags” • Tag colors shown: Green, Blue, Black, Peach • Criteria flagged include: NCP employer is verified and wage withholding is active; NCP employer is verified but wage withholding is inactive; NCP in Department of Corrections; NCP employer is not verified; reciprocal case is not initiating; NCP address is verified
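Illustrative only: one possible way to attach tag colors to a case from criteria like those listed above. The slide does not spell out the actual CSe-Tools color-to-criteria mapping, so the assignments below are hypothetical.

```python
# Hypothetical tag assignment; the real color-to-criteria mapping is defined
# by the CSe-Tools work lists, not by this sketch.
def tag_case(case):
    tags = []
    if case["ncp_in_doc"]:
        tags.append("black")   # NCP in Department of Corrections
    if case["ncp_employer_verified"]:
        tags.append("green" if case["wage_withholding_active"] else "blue")
    else:
        tags.append("peach")   # employer not verified
    return tags

print(tag_case({"ncp_in_doc": False,
                "ncp_employer_verified": True,
                "wage_withholding_active": False}))   # -> ['blue']
```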

  30. KPI: Percent of Current Support Paid • The gold area shows the entire measure amount (i.e., total current support owed year-to-date for the worker's caseload). • The vertical bar shows the goal (i.e., the MSO goal for the YTD caseload). • The parallel bar shows progress toward the goal (i.e., total MSO paid for the YTD caseload). • Hover over each element to see the numerical amounts.
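A rough sketch of the bullet-style KPI graphic described above, using matplotlib; the dollar amounts and styling are made up for illustration.

```python
import matplotlib.pyplot as plt

# Illustrative bullet-style graph: a gold band for total current support owed,
# a narrower bar for the amount paid, and a vertical line for the goal.
owed, paid, goal = 100_000, 62_000, 70_000   # made-up YTD dollar values

fig, ax = plt.subplots(figsize=(6, 1.5))
ax.barh(0, owed, height=0.6, color="gold", label="Current support owed (YTD)")
ax.barh(0, paid, height=0.25, color="steelblue", label="Current support paid (YTD)")
ax.axvline(goal, color="black", linewidth=2, label="Goal")
ax.set_yticks([])
ax.set_xlabel("Dollars")
ax.legend(loc="upper right", fontsize=7)
plt.tight_layout()
plt.show()
```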

  31. KPI: Percent of Current Support Paid

  32. Evaluation • Data analysis for • Percent of Current Support Paid • Percent of Arrears Cases with a Payment • Post-implementation surveys and interviews

  33. Evaluation (cont.) • Statistical findings invalid • Low number of demonstration participants • Short time period (17 months) for grant • Inability to develop assumptions and findings representative of the State • Inability to isolate impact of variables

  34. Evaluation (cont.) • Post-implementation surveys and interviews provided a wealth of information • Lessons learned will help ensure a successful rollout of the Performance Dashboard in Colorado • Valuable information for other states interested in implementing a performance dashboard

  35. Lessons Learned • Training: Key to success • Two-fold • Functionality of the dashboard, especially if new technology is involved • How to use the dashboard to manage a caseload • Define clear expectations • Replacement vs. supplemental tool • Resistance to change

  36. Lessons Learned (cont.) • Value – Caseload size • Less value for workers with smaller caseloads or from smaller counties • Ten large counties in Colorado = 80% of State’s caseload = very valuable to Colorado

  37. Lessons Learned (cont.) • Support must come from the TOP down • Real-time interface is critical • More information is not always better • Ability to create personalized work lists • Identify cases reported on multiple work lists

  38. Lessons Learned (cont.) • Functionality to record notes on work lists minimizes duplicated research and allows continuous analysis at a case level • Matrix of appropriate actions for each work list / tag is helpful for less experienced workers

  39. Finale • Final grant report submitted September 30, 2010 • Statewide rollout to commence July 2011
