
The Requirement to Report Accurate Data


Presentation Transcript


  1. The Requirement to Report Accurate Data ETA’s Data Validation Policy for Employment and Training Programs

  2. Overview • Background • ETA’s Data Validation Policy • The Need for Quality Control • Creating a Workable Data Management Strategy • Best Practices • WIA Validation Results • Questions • Web Sources for Additional Information

  3. Background • Validation Policy alone does not ensure quality data

  4. Policy alone does not ensure quality data; it must be paired with: • Monitoring • Active program management • Policies and information systems that support data integrity

  5. Background • GPRA Section 1115(a)(6) requires Federal agency performance plans “describe the means to be used to verify and validate measured values” • “WIA Performance Outcomes Reporting Oversight,” OIG Report No. 06-02-006-03-390, September 2002

  6. Background • “Data Validation Policy for Employment and Training Programs,” TEGL No. 3-03, Change 3, July 2005 • GAO Report to Congress “Labor and States Have Taken Actions to Improve Data Quality, but Additional Steps Are Needed,” GAO-06-82, November 2005

  7. ETA’s Data Validation Policy

  8. Validation: “Why Should We Do It?” • Data validation is required by OIG and is now being reviewed by GAO • Data validation is a key component in the overall performance strategy • Program funding is being directly tied to reliable performance outcomes (performance budget integration) • Data validation is integrated into reporting

  9. From Performance Objectives to Performance Outcomes
  • 1. Performance Objectives – We begin with policy regarding performance objectives; the starting point is generally federal legislation. What do we want to know? What information do we need?
  • 2. Specifications for Reports and Data Elements – The objectives must be translated into detailed specifications outlining the operational parameters for measurement. “Data element specifications” define the fields and values required to calculate the reports, including key dates, services received, and wages earned after exit. “Report specifications” define how the fields and values are used to calculate outcomes, including which participant records are included in each numerator and denominator.
  • 3. Standardized Software – Once we have specifications, we have to create standardized software that applies edits to the participant files and performs all of the required calculations.
  • 4. Validation of Calculations & Data – We have to ensure grantees are not just performing calculations in accordance with required specifications (report validation), but that the data elements used in the calculations are accurate as well (data element validation). If grantees use the standardized software to calculate their report, they do not need to perform report validation. Data element validation requires checking data against source documentation.
  • 5. Validation Reports – Summary and analytical reports from grantees indicate the level of error in the performance outcomes and data elements. This is necessary both for feedback to states and to help interpret the outcomes. (Note that error levels have not yet been established.)
  • 6. Performance Outcomes – Performance outcomes conveying program “success,” based on increasingly more valid data, are reported to stakeholders at various levels.
  The Data Reporting and Validation System (DRVS) is central to the reporting process, not just an “after the fact” validation of reports.
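
  To make the report-validation step concrete, here is a minimal Python sketch of the idea: recompute an outcome from the participant file and compare it to the figure the grantee reported. The field names, sample records, and reported rate are illustrative assumptions; the actual fields and calculations are defined in the DRVS specifications.

```python
# Hypothetical sketch of report validation: recompute an outcome from
# participant records and flag any gap versus the reported figure.

def entered_employment_rate(records):
    """Recalculate a simple outcome: share of exiters employed after exit."""
    denominator = [r for r in records if r["date_of_exit"] is not None]
    numerator = [r for r in denominator if r["employed_after_exit"]]
    return len(numerator) / len(denominator) if denominator else 0.0

# Illustrative participant file (field names are assumptions, not DRVS fields).
participant_file = [
    {"date_of_exit": "2005-06-30", "employed_after_exit": True},
    {"date_of_exit": "2005-07-15", "employed_after_exit": False},
    {"date_of_exit": None, "employed_after_exit": False},  # still enrolled
]

reported_rate = 0.75  # outcome taken from the grantee's annual report (illustrative)
recalculated = entered_employment_rate(participant_file)

# Report validation flags any discrepancy between reported and recalculated outcomes.
if abs(reported_rate - recalculated) > 0.001:
    print(f"Discrepancy: reported {reported_rate:.2%}, recalculated {recalculated:.2%}")
```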

  10. ETA’s Data Validation Policy • Grantees are required to annually validate data submitted to ETA to ensure accuracy • Two types of validation required: • Report (required before submitting an annual report) • Data element (required within 120 days after submitting individual records) • Failure to validate reported data is deemed “failure to report”

  11. ETA’s Data Validation Policy • Policy covers six ETA employment and training programs • Policy was rolled out in phases: • 1st and 2nd phases (years 1 and 2) – detecting and resolving issues with data and reporting systems, and compiling error rates that were analyzed to set accuracy standards for the 3rd year (finished) • 3rd phase (year 3) – accuracy standards are applied to data reported by grantees (being completed now) • Data not meeting accuracy standards will be considered unacceptable for measuring performance

  12. ETA’s Data Validation Policy • Grantees will be held accountable for meeting accuracy standards • With technical assistance from ETA, grantees should have developed and implemented corrective action plans • Significant or unresolved deviation from accuracy standards may be deemed “failure to report” • Grantees are encouraged to use ETA-developed handbooks, software, and guides to validate reported data • ETA will monitor State validation efforts

  13. ETA’s Support of the Effort • Validation tools are evolving to meet state needs • Policies are changing and becoming clearer based on state responses • Technical support for state staff from: • Regional office • National office • Mathematica staff

  14. State and Local Roles and Responsibilities • States had varying experiences in identifying validation assignments • Some states had no problem • Some states took time to sort through roles of different units • Some states still have not clarified assignments (particularly for TAA) • Organization of case files at local areas was often not standardized or adequate

  15. States’ Experiences with Data Validation • States had to determine which staff would be responsible for data validation • States had to communicate expectations and requirements to local areas

  16. States’ Experiences with Data Validation • Mode of data element validation – onsite, centralized, or both • Onsite validation is the recommended mode • Only large states can request an exemption to perform centralized validation; the request must be made in writing to the regional office before the process begins

  17. Various Methods for Data Element Validation • Onsite validation is essential to preserve the integrity of the process • Ideal for state staff to perform validation onsite • Promotes communication and mutual understanding with locals

  18. Various Methods for Data Element Validation • In some cases, onsite validation is impractical • Distances are too great • Small number of records • With the approval of the regional office, states can therefore pursue a combination of onsite and remote validation if necessary

  19. Detail for Reduction in Elements

  20. Schedule for Reporting of Validation Results • WIA report validation was due October 1, 2005, prior to submitting the annual report • WIA and TAA data element validation will be due February 1, 2006 • Labor Exchange (LX) report validation was due August 15, 2005, with the report

  21. The Need for Quality Control [Cartoon: “Oh, oh – the quality-control quality-control inspectors are here.”]

  22. The Need for Quality Control • ETA’s policy seeks only to assess the usability of reported data for federal purposes • Grantees, not ETA, determine the activities to be undertaken to ensure valid data are reported to ETA • ETA will offer best practices and technical assistance • Originally, few, if any, grantees had well-articulated data management strategies • Grantees should have started to formulate strategies to improve data quality

  23. The Need for Quality Control • Data management strategies for high-performing organizations typically addressed four factors: • Completeness – ensuring critical data elements contain needed information • Timeliness – ensuring data are collected and reported in a timely fashion • Validity – ensuring the reported information can be substantiated or confirmed • Reliability – ensuring the reported data are trustworthy for decision-making purposes • Understanding the data flow is key to creating a workable data management strategy
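
  As a rough illustration of how the first two factors might be checked programmatically, here is a minimal Python sketch. The field names, the 30-day timeliness rule, and the sample record are assumptions for illustration, not part of ETA’s policy.

```python
# Hypothetical completeness and timeliness checks on a participant record.
from datetime import date

# Illustrative list of critical data elements (an assumption, not ETA's list).
CRITICAL_FIELDS = ["ssn", "date_of_birth", "date_of_registration"]

def completeness(record):
    """Completeness: every critical data element contains a value."""
    return all(record.get(f) for f in CRITICAL_FIELDS)

def timeliness(record, max_lag_days=30):
    """Timeliness: data entered within a set number of days of collection."""
    lag = (record["date_entered"] - record["date_collected"]).days
    return 0 <= lag <= max_lag_days

record = {
    "ssn": "999-99-9999",  # obviously fake SSN for the example
    "date_of_birth": date(1970, 2, 28),
    "date_of_registration": date(2005, 1, 10),
    "date_collected": date(2005, 1, 10),
    "date_entered": date(2005, 1, 12),
}
print(completeness(record), timeliness(record))  # True True
```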

  24. The Need for Quality Control: Typical Data Flow • Source → Collect Data → Data Entry → Quality Checks → Report and Individual Records • The report is subject to ETA-required report validation; the individual records are subject to ETA-required data element validation

  25. The Need for Quality Control: Reviewing Data Quality • State policies and procedures • MIS systems • Manuals • Training • LWIA monitoring

  26. The Need for Quality Control: Cycle of Quality Management • Federal issuance of policy and guidance → State review of policy → Issuance of guidance → Training of local-level staff → Initial data entry → Local review and monitoring → State review and monitoring of data → Federal monitoring and training → (cycle repeats)

  27. Creating a Workable Data Management Strategy • Data collection and data entry… • Grantees should develop guidance for staff and sub-grantees involved in the collection of data • Definitions of data elements • Sources of information • Participant record and documentation requirements • Procedures for collecting, entering and reporting data and associated “business rules” that cover timeliness and completeness • Procedures for entering data into an automated database • Procedures for correcting data

  28. Creating a Workable Data Management Strategy • Data collection and data entry… • Grantees should provide routine training on the data management guidance • Grantees should require that all persons involved in the collection or entry of data be trained in the procedures • The data entry process should include steps for verifying entered data against original sources, on a sample basis or for the entire population of records

  29. Creating a Workable Data Management Strategy • Grantees should conduct periodic quality checks… • Sometimes data are correctly transcribed from forms but are still invalid • Recording a date of February 29, 2003, when that month has only 28 days • Recording a date of exit that occurs before the customer’s date of registration • Range check – reviewing recorded data to ensure legitimate values are captured • Logic check – reviewing recorded data to see if it makes sense • Edits are typically programmed into data entry screen applications • Logic checks typically involve ad hoc queries and manual file reviews
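
  Here is a minimal Python sketch of the two edit types just described, using the slide’s own examples and hypothetical field names; production systems would build such edits directly into the data entry screens.

```python
# Hypothetical range and logic checks of the kind described on slide 29.
from datetime import date

def range_check(year, month, day):
    """Range check: reject values outside legitimate bounds,
    e.g. February 29 in a non-leap year."""
    try:
        date(year, month, day)  # raises ValueError for impossible dates
        return True
    except ValueError:
        return False

def logic_check(record):
    """Logic check: the exit date cannot precede the registration date."""
    return record["date_of_exit"] >= record["date_of_registration"]

print(range_check(2003, 2, 29))  # False: 2003 is not a leap year
print(logic_check({
    "date_of_registration": date(2003, 5, 1),
    "date_of_exit": date(2003, 4, 1),
}))  # False: exit precedes registration
```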

  30. Creating a Workable Data Management Strategy • Grantees should conduct periodic quality checks… • Grantees should evaluate data collection efforts by randomly observing interviews and reviewing other data collection methods to ensure procedures and instructions are followed properly • Grantees should assess the accuracy of data by verifying entered data against original sources on a sample basis or for the entire population of records • Grantees should link the data quality review process to management actions for continuous improvement
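
  For the sample-based verification mentioned above, a grantee might draw a simple random sample of record IDs to check against source documentation. A minimal sketch, with an illustrative sample size (ETA’s handbooks define the actual sampling specifications):

```python
# Hypothetical draw of a repeatable random sample of records for verification.
import random

def draw_validation_sample(record_ids, sample_size, seed=42):
    """Select records whose entered data will be checked against sources."""
    rng = random.Random(seed)  # fixed seed so the draw is repeatable
    size = min(sample_size, len(record_ids))
    return sorted(rng.sample(record_ids, size))

all_records = list(range(1, 501))  # e.g., 500 participant records
print(draw_validation_sample(all_records, sample_size=25))
```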

  31. Best Practices

  32. Best Practices • Creating a unified record system for your local area • Indexing records • Training • Using state MIS/policy manuals • Using labels/tables to reflect data elements • Ensuring secondary quality checks are in place

  33. Best Practices • NC File System • Cohorts: • A = Adult • DW = Dislocated Workers • OY = Older Youth • YY = Younger Youth • Categories: • E = Eligibility/Intake/Application • A = Employment Activities • XP = Exit and Post-Program Activities

  34. Best Practices • Adult’s date of birth – AE2 • A = adult • E = eligibility/intake/application • 2 = the U.S. DOL’s reference number for date of birth • Copy of a birth certificate in the file is labeled AE2
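
  A tiny Python sketch of composing file labels under this convention; the code tables mirror slides 33 and 34, and the function name is hypothetical:

```python
# Hypothetical helper for the NC-style file labeling convention:
# cohort code + category code + DOL element number, e.g. "AE2".
COHORTS = {"A": "Adult", "DW": "Dislocated Workers",
           "OY": "Older Youth", "YY": "Younger Youth"}
CATEGORIES = {"E": "Eligibility/Intake/Application",
              "A": "Employment Activities",
              "XP": "Exit and Post-Program Activities"}

def file_label(cohort, category, element_number):
    """Build a label such as 'AE2' for an adult's date-of-birth document."""
    assert cohort in COHORTS and category in CATEGORIES
    return f"{cohort}{category}{element_number}"

print(file_label("A", "E", 2))  # AE2: adult, eligibility, element 2 (date of birth)
```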

  35. Best Practices • Source documentation should match or support the data element being validated

  36. Best Practices • Integrated MIS Systems • Support validation and reporting • TN and KY are using integrated common measures-based systems

  37. Best Practices • A KY local area compares the EKOS entry screen to the actual case file to look for errors

  38. Best Practices • SCVOS has popup windows listing the types of documentation allowed for each field where verification is required

  39. Summary of Best Practices • Data quality and validation: • A continuous process • Increases performance • Ensures compliance

  40. Validation Results

  41. WIA Validation • Some States had poor results compared to the rest of the Nation • This is GOOD news! • Why is that good news? • It shows States followed the data validation instructions and did not skew their results with false passes! • States can actually use validation results to improve quality

  42. Key Areas to Focus on for QC • WIA registration date • Date of exit • Documentation of activity dates • Wages – screenshots to archive as documentation • Vets – DD-214

  43. Questions / Open Discussion

  44. Review • Background • ETA’s Data Validation Policy • The Need for Quality Control • Creating a Workable Data Management Strategy • Best Practices • WIA Validation Results • Web Sources for Additional Information

  45. For More Information… ETA’s Performance and Results Website http://www.doleta.gov/Performance/ ETA’s Data Validation Handbooks and Software http://www.doleta.gov/Performance/reporting/tools_datavalidation.cfm

  46. Contact Information
