Program and Compliance Management


Presentation Transcript


  1. Program and Compliance Management (Virtually) Workshop: Data Element Validation—Issues from Federal Monitoring

  2. Outline
  • What's the approach to Federal Monitoring?
  • What are the consistent Compliance Issues?
  • What are the consistent Areas of Concern?
  • How can these reviews be used as a springboard to system improvements?

  3. Background
  • Oversight agencies like GAO and OIG cited data quality issues with ETA's data (2002)
  • TEN (Training & Employment Notice) 14-02, issued 5/28/03, announced the Data Validation Initiative
  • TEGL (Training & Employment Guidance Letter) 3-03, issued 8/20/03, provided implementation guidance, described the process, and offered tools
  • "…ETA will monitor the validation effort on a regular schedule."
  • Guidance is issued annually containing report submission deadlines and source documentation requirements

  4. Federal Monitoring of Data Validation

  5. In General . . .
  • Most regions began monitoring DV after common measures were implemented
  • The compliance "playbook":
    – Program Reporting Instructions
    – DV policy guidance and handbooks
    – TEGL 17-05
  • The overall objective is data quality (data are accurate and reliable)

  6. Review Approach
  • Where the review takes place: traditional on-site review, remote review, or a combination
  • Focus of the review: Program Reporting and DV, or DV as a component of a comprehensive review
  • Structure of the review team: varies based on regional protocols

  7. Review Scope
  • Programs: Workforce Investment Act (WIA), Trade (TAA), LX (Wagner-Peyser/VETS)
  • Validation cycles: the most recently completed cycle; an additional cycle is sometimes included
  • Components: Report Validation, Data Element Validation, Process Evaluation

  8. Components of Review: Assessment of Report Validation (RV)
  • For WIA and LX only – not TAA
  • RV focuses on aggregate calculations
  • Did the state validate its annual data submission (WIA and LX)?
  • How were "high" error rates addressed?
  • Generally completed up front (e.g., before the site visit)
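
For illustration only: because RV centers on aggregate calculations, an RV-style check can be sketched as recomputing a reported count from record-level data and comparing it to the figure the state submitted. The "exiter" field and the error-rate formula below are assumptions, not ETA's actual DRVS logic.

```python
# Minimal RV-style check: recompute an aggregate count from record-level
# data and compare it to the value the state reported. The "exiter"
# field and the error-rate formula are illustrative assumptions only.

def report_validation_error_rate(records, reported_count, field="exiter"):
    """Return the relative difference between reported and recomputed counts."""
    validated_count = sum(1 for r in records if r.get(field))
    if validated_count == 0:
        return 0.0
    return abs(reported_count - validated_count) / validated_count

records = [{"exiter": True}, {"exiter": True}, {"exiter": False}]
print(report_validation_error_rate(records, reported_count=3))  # 0.5
```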

  9. Components of Review: Assessment of Data Element Validation (DEV)
  • DEV = record review = case file review
  • Required across the core workforce programs; focuses on the data elements used in performance calculations
  • Federal staff revalidate a sub-sample of the state's sample: a random sample for WIA and TAA; all 25 records for LX
  • Review data element error rates: what has the state done with the information?
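
As a rough illustration of how DEV error rates fall out of a record review, the sketch below computes a per-element error rate (errors found divided by records reviewed) over a revalidated sub-sample. The element names and record layout are hypothetical.

```python
# Sketch of per-element error rates from a case file review: a record
# counts as an error for an element when the MIS value disagrees with
# the source documentation. Element names are hypothetical.
from collections import Counter

def element_error_rates(reviewed_records, elements):
    """Return {element: error rate} over the reviewed sub-sample."""
    errors = Counter()
    for record in reviewed_records:
        for element in elements:
            if record["mis"].get(element) != record["source"].get(element):
                errors[element] += 1
    n = len(reviewed_records)
    return {element: errors[element] / n for element in elements}

sample = [
    {"mis": {"veteran": "yes"}, "source": {"veteran": "yes"}},
    {"mis": {"veteran": "no"},  "source": {"veteran": "yes"}},
]
print(element_error_rates(sample, ["veteran"]))  # {'veteran': 0.5}
```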

  10. Components of Review: Process Evaluation
  • Data management, and the resulting quality of reported data, is derived from and influenced by the policies, procedures, and protocols used at the state and/or local levels
  • Review of the reporting and validation process (including the MIS)
  • Review of policies, procedures, and protocols (including training) and how they are deployed

  11. Review Approach and Scope in the Dallas Region (Region 4)
  • Traditional on-site review (one full week) that includes a local office visit; a remote review only if a site visit is not possible
  • Scope consists of the core workforce programs
  • The two most recent cycles, unless the prior cycle was already reviewed
  • "…the organization, data collection, report preparation and data validation work activities of the state and how these activities are managed on a daily basis."
  • Joint ETA/VETS review team, which also includes the formula FPO and others

  12. Review Approach and Scope in the Dallas Region (Region 4) (2)
  • The process evaluation component begins as soon as the review is scheduled and addresses objectives covered in the Core Monitoring Guide and Supplements
  • Review results inform subsequent program reviews
  • All eleven states have been reviewed; a second round of reviews will commence in FY 2013

  13. Clarifying Accuracy Standards
  • Across the core workforce programs, ETA has two published error rate thresholds:
    – (LX) No errors allowable for LX DEV (0% errors in the minimum 25-record sample)
    – (WIA) States submitting annual reports with RV errors that exceed 2% will not be eligible for WIA incentives for that PY (TEGL 9-07, dated 10/10/07)
  • We have a provisional error rate threshold of 5% for WIA and TAA data elements
  • Error rates exceeding 5% are cited as an "Area of Concern" only
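
Read together, these thresholds behave like a small decision rule. The sketch below encodes that rule for illustration only; the labels and return values are assumptions, and this is not ETA tooling.

```python
# Illustrative sketch of the accuracy standards on this slide: 0% for
# LX DEV, a provisional 5% DEV threshold for WIA/TAA that yields only
# an "Area of Concern," and (separately) the 2% RV threshold affecting
# WIA incentive eligibility. Labels and return values are assumptions.

def classify_dev_error_rate(program, error_rate):
    if program == "LX":
        # No errors allowable in the minimum 25-record LX sample.
        return "compliant" if error_rate == 0.0 else "finding"
    if program in ("WIA", "TAA"):
        # Provisional 5% threshold; exceeding it is an Area of Concern only.
        return "area of concern" if error_rate > 0.05 else "compliant"
    raise ValueError(f"unknown program: {program}")

def wia_incentive_eligible(rv_error_rate):
    # TEGL 9-07: RV errors above 2% forfeit WIA incentives for that PY.
    return rv_error_rate <= 0.02

print(classify_dev_error_rate("LX", 0.04))   # finding
print(classify_dev_error_rate("WIA", 0.06))  # area of concern
print(wia_incentive_eligible(0.03))          # False
```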

  14. Consistent "Findings" Across Regions

  15. Non-Compliance with EXIT Requirements
  • Exit dates not reflective of the date of last service (a sketch of the exit-date rule follows this list):
    – Gaps in service spanning years
    – Case management used to extend the exit date
  • Hard exits utilized:
    – Date of last contact = exit date
    – Date of employment = exit date
    – Services provided within 90 days
  • Exit dates not consistent with dates in the MIS
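
A minimal sketch of the exit-date rule implied by these findings, assuming the common-measures convention that the exit date is the date of last service, recognized only once 90 days pass with no further service; names and structure here are illustrative.

```python
# Sketch of the common-measures exit rule referenced above: the exit
# date is the date of last service, recognized only once 90 days pass
# with no further service. The 90-day window comes from the slides;
# everything else is illustrative.
from datetime import date, timedelta

def derive_exit_date(service_dates, as_of):
    """Return the exit date, or None if the participant is still active."""
    if not service_dates:
        return None
    last_service = max(service_dates)
    if as_of - last_service >= timedelta(days=90):
        return last_service  # exit date = date of last service
    return None  # a service within the last 90 days means no exit yet

services = [date(2012, 1, 10), date(2012, 3, 2)]
print(derive_exit_date(services, as_of=date(2012, 7, 1)))  # 2012-03-02
```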

  16. Participation Cycles and Dates of Service
  • Although there are clear issues around exit, there are also issues around participation cycles and dates of service in general
  • Service provision prior to formal participation
  • Staff unclear about which services commence participation
  • Dates of service inconsistent across the file and the MIS, within the MIS, within the file, and within documents

  17. Incorrect Capture/Coding of Required Data Elements
  • Several data elements are routinely cited across reviews:
    – Ethnicity and Race
    – Disability Status
    – Veteran Status
    – UI Claimant Status
    – UC Eligible Status
    – School Status at Participation
    – Needy Family Status
    – Dislocation Date
  • Example: ethnicity and race combined (in the MIS, on forms, etc.)
  • Example: no response for ethnicity interpreted as a negative response (an MIS default; see the sketch below)
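
The MIS-default problem in the last example is easy to reproduce in code: if the field is stored as a plain yes/no value, a non-response silently collapses into "no." Below is a hedged sketch of three-valued coding that avoids this; the field and value names are hypothetical.

```python
# Sketch of the ethnicity-default problem described above: storing the
# response as a plain boolean silently turns "no response" into "no."
# Modeling the element with three values preserves non-response.
# Field and value names are hypothetical.
from enum import Enum
from typing import Optional

class EthnicityResponse(Enum):
    YES = "yes"
    NO = "no"
    DID_NOT_SELF_IDENTIFY = "did not self-identify"

def code_ethnicity(raw: Optional[str]) -> EthnicityResponse:
    if raw is None or raw.strip() == "":
        # A blank is not a negative response; record it as non-response.
        return EthnicityResponse.DID_NOT_SELF_IDENTIFY
    if raw.strip().lower() == "yes":
        return EthnicityResponse.YES
    return EthnicityResponse.NO

print(code_ethnicity(None))  # DID_NOT_SELF_IDENTIFY, not NO
```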

  18. Wage Records Not Available for Validation
  Two primary issues:
  • Not accessible due to privacy concerns
  • Data not frozen or archived

  19. Wage Records Not Available: Access to the Data
  • The WRIS Data-Sharing Agreement specifies:
    – Info can be shared with "…auditors who are public employees seeking access to the information in the performance of their official duties."
    – "The PACIA shall permit ETA … to make onsite inspections… for … conducting program audits…"

  20. Wage Records Not Available: Data Not Frozen or Archived
  • DV is a point-in-time activity, but wage records are dynamic information
  • Federal policy requires that the data be frozen or archived so that federal reviewers can follow the same audit trail as state validators
  • Federal reviewers cannot utilize a live database
  • Because the data have not been kept, there is no audit trail – which is also a finding (record retention)
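
One way a state might satisfy the freeze/archive expectation is to write a dated, read-only extract at validation time. The sketch below is one hypothetical approach, not prescribed federal procedure; the paths, file format, and field names are assumptions.

```python
# Sketch of "freezing" the wage data used for validation: write a
# dated, read-only extract at validation time so reviewers can later
# retrace the exact audit trail the state validators used. Paths,
# format, and field names are illustrative only.
import csv
import os
from datetime import date

def archive_wage_extract(rows, archive_dir="dv_archive"):
    """Write rows to a dated CSV snapshot and mark it read-only."""
    os.makedirs(archive_dir, exist_ok=True)
    path = os.path.join(archive_dir, f"wage_extract_{date.today():%Y%m%d}.csv")
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    os.chmod(path, 0o444)  # snapshot must not change after validation
    return path

rows = [{"id_hash": "ab12", "quarter": "2012Q1", "wages": "8150.00"}]
print(archive_wage_extract(rows))
```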

  21. Record Retention
  Two primary issues:
  • Participant files missing, cannot be located, or documents missing
  • Wage records and validation extract files purged or simply not kept

  22. Record Retention (2)
  • Wage record data example: data periodically purged from the state system, along with the DV extract files and documentation
  • "Data validation results and documentation should be retained for at least three years after completion. Retention methods are at the state's discretion and may include an archived database, printed worksheets and reports, or other methods." (DRVS Handbook, available at www.doleta.gov/performance)

  23. Source Documentation: Incorrect and/or Lacking
  • Not using the most recent guidance (sources changed)
  • Using the MIS as an allowable source (a checkmark or radio button is insufficient by itself)
  • Lack of documentation to support service provision (e.g., youth follow-up services)

  24. Source Documentation: Incorrect and/or Lacking (2)
  • Policy includes incorrect sources or combines eligibility documentation with DV source documentation, resulting in incorrect DV sources
  • Self-attestation forms signed by the case manager, not the participant
  • Using an internal form as a "cross-match" with another database

  25. Veterans' Services Issues
  • Coding errors:
    – Individuals not counted as veterans but receiving veterans' services (from LVER/DVOP)
    – If an individual is a veteran, other veteran-related data elements are required, but the state MIS either allows no response or defaults to "no" (an edit check addressing this is sketched below)
  • Priority of Service:
    – Signage/policy indicating 180 days of active duty service is needed to be eligible for POS
    – For priority purposes, only one day of active duty service is needed
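
A simple MIS edit check can head off the veteran coding error above by requiring an explicit answer for the dependent elements instead of defaulting to "no." The sketch below is illustrative; the element names are hypothetical.

```python
# Sketch of an MIS edit check for the veteran coding issue above: when
# veteran status is "yes," the related veteran data elements must carry
# an explicit answer, and a blank must surface as an error rather than
# silently defaulting to "no." Element names are hypothetical.
REQUIRED_IF_VETERAN = ("campaign_veteran", "disabled_veteran", "recently_separated")

def veteran_edit_check(record):
    """Return a list of edit-check error messages for one record."""
    errors = []
    if record.get("veteran_status") == "yes":
        for element in REQUIRED_IF_VETERAN:
            if record.get(element) not in ("yes", "no"):
                errors.append(f"{element}: explicit response required for veterans")
    return errors

record = {"veteran_status": "yes", "disabled_veteran": "no"}
print(veteran_edit_check(record))  # flags the two unanswered elements
```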

  26. Consistent "Areas of Concern"

  27. DEV Error Rates > 5%
  • "High" error rates can only be noted as an Area of Concern
  • Applies to WIA, TAA, or both
  • In some cases, high error rates continue across validation cycles for the same data elements
  • What has the state done to eliminate or minimize errors?

  28. DV Results Not Utilized
  • In some instances, results are not shared, or are only noted during the state's exit conference
  • Conducting the required validation for compliance purposes only is a missed opportunity (at best)
  • If the intent is data quality, then using the results of annual validation for continuous improvement just makes sense

  29. Quality of Participant Files
  • Lack of a consistent file structure
  • Documents missing
  • Case notes missing, skimpy, illegible, or irrelevant
  • The MIS contains different information
  • One file covering multiple periods of participation for the same individual

  30. Need for Policies/Procedures
  • Just to report basic metrics, participant data go through many steps: gathering, data entry, case management, the extract process, data validation, etc.
  • Strong operating policies and procedures are essential at each step to maximize accuracy and minimize confusion
  • Active management of the data (at the state and local levels) is necessary
  • Note: Reviews have also highlighted the need for required policies (e.g., youth needing additional assistance)

  31. DV Review as Springboard to Needed Improvements

  32. Myth: Data Management is Purely Technical
  • Reality: in the end, data management is really about providing the best possible services
  • Reporting and validation exist to support effective service provision
  • Accurate reporting is a requirement, but it's also a tool
  • Data represent actual participants and their experiences with our system!

  33. Myth: Data Management is Easy
  • Reality: data management is HARD!
  • Business rules are complex and multi-layered
  • Data sets are large and hard to visualize
  • Specs are complex and evolving (!@#$%)
  • Circumstances change
  • …and sometimes the closer you look, the less clear things become, which is another challenge

  34. Enter: DV Reviews
  Because these reviews look at the entire system from the bottom up (e.g., how data are collected) and from the top down (e.g., state policies), Program Reporting and Data Validation Reviews can identify system weaknesses and opportunities for improvement.

  35. Got Gaps? Data Validation Reviews can –
  • Clarify federal policy and expectations
  • Identify policy and procedural gaps
  • Facilitate needed MIS changes, edit checks, and adjustments
  • Demonstrate the need for training
  • Highlight ways to improve "regular" monitoring
  • And more!

  36. Although no one can predict the future with certainty, data quality will remain a key factor in decision-making, including funding decisions. Questions?

  37. Resources
  • The most recent source documentation for WIA data validation is part of TEGL 28-11: http://wdr.doleta.gov/directives/attach/TEGL/TEGL_28-11-Atta1.pdf
  • Reporting tutorials and user guides are on ETA's performance website: www.doleta.gov/performance
  • TEGL 9-07 (2% RV error rate threshold): http://wdr.doleta.gov/directives/attach/TEGL09-07acc.pdf
  • All guidance letters are posted at http://wdr.doleta.gov/directives
