ETA Data Validation July 2003

Presentation Transcript


  1. ETA Data Validation, July 2003

  2. Overall ETA Data Validation Project Goals
  • Develop a comprehensive, systematic data validation system to ensure data integrity across programs
  • Increase uniformity in data definitions and data collection across similar programs
  • Strike the proper balance between data integrity and burden to achieve an acceptable, sustainable level of error
  • Coordinate closely with the DOL Office of Inspector General on methods and approach

  3. Validity and Verification – Dept. of Labor Perspective
  • Develop a reputation for reliable and accurate program data
  • Administration’s focus on management and accountability
  • Improve the basis for incentives and sanctions
  • Basis for continuous improvement

  4. Programs Included
  • Unemployment Insurance Benefits and Tax (UI)
  • Workforce Investment Act (WIA)
  • Trade Adjustment Assistance (TAA and NAFTA-TAA)
  • Labor Exchange (LX)
  • Migrant and Seasonal Farm Worker Program (MSFW)
  • Division of Indian and Native American Programs (DINAP)
  • Senior Community Service Employment Program (SCSEP)
  • Office of Apprenticeship Training, Employer and Labor Services (OATELS)

  5. Stages of the Project
  • Reporting, performance, and validation requirements analysis and specifications
  • Develop validation tools
  • Pilot validation methodology
  • Training
  • Technical assistance

  6. Requirements Analysis and Specifications
  • Requirements analysis and specifications document the reporting and performance needs of each relevant ETA program
  • Documentation is organized in the ETA Reporting and Performance Database
  • Database defines each data element and reporting specification for each report and performance item

  7. 2. Develop Validation Tools – Handbooks
  • Handbooks contain reporting specs and validation instructions, including acceptable source documentation
  • SCSEP handbook is awaiting final specs
  • LX has no handbook, only software and a user’s guide; no case-record-level data validation at this time

  8. 2. Develop Validation Tools – Software
  • Software completed for LX, WIA, and TAA
  • Software under development for MSFW and DINAP
  • Distribution of handbooks and software via ETA websites
    • LX: www.uses.doleta.gov/rptvalidation.asp
    • WIA and TAA: www.uses.doleta.gov/dv/

  9. 3. Pilot – State Programs
  • Two formal state pilots
    • Texas – WIA
    • Washington State – WIA, TAA, LX
  • Utah and West Virginia have been trained
  • LX was implemented in August 2002
  • Other states are testing WIA

  10. 4. Training
  • Regional training sessions are being held in the summer of 2003 for WIA, LX, and TAA
  • Other programs – determine training strategy individually
    • Tie into national meetings
  • 2-3 sessions per program

  11. 5. Technical Assistance
  • Phone and e-mail TA available for:
    • Installing software
    • Building and loading extract files
    • Conducting report validation
    • Conducting data element validation
  • Contact information in the software user’s guide and the software help menu
  • TA e-mail addresses
    • For WIA: WIATA@mathematica-mpr.com
    • For LX: ESTA@mathematica-mpr.com
    • For TAA: TAATA@mathematica-mpr.com

  12. How Data Validation Systems Improve Data Quality
  • Improve communication from ETA to programmers
  • Provide a blueprint or roadmap for understanding reporting and performance measurement
  • Minimize the burden of interpreting specifications
  • Provide clear standards for assessing validity
  • Provide detailed diagnostic data for correcting problems

  13. Report Validation
  • Given the data that are stored, is the software generating the correct counts?
  • Develop an audit trail to support the numerators and denominators for each performance outcome
  • Classifying participant records into performance outcome groups enables non-technical staff to validate and analyze program outcomes
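To make the idea of performance outcome groups concrete, here is a minimal sketch (not the ETA validation software; the measure, field names, and exclusion rule are simplified assumptions) of classifying exiter records into groups and deriving a numerator and denominator from the group counts:

```python
# Illustrative sketch only -- field names and the measure are hypothetical.
from dataclasses import dataclass

@dataclass
class ExiterRecord:
    record_id: str
    employed_at_registration: bool
    employed_after_exit: bool       # e.g., a wage record found after exit

def classify(record: ExiterRecord) -> str:
    """Assign the record to a performance outcome group so counts can be
    audited group by group."""
    if record.employed_at_registration:
        return "excluded"           # not part of this measure's denominator
    if record.employed_after_exit:
        return "positive_outcome"   # counted in numerator and denominator
    return "negative_outcome"       # denominator only

def entered_employment_rate(records: list) -> float:
    groups = [classify(r) for r in records]
    numerator = groups.count("positive_outcome")
    denominator = numerator + groups.count("negative_outcome")
    return numerator / denominator if denominator else 0.0

records = [
    ExiterRecord("A1", employed_at_registration=False, employed_after_exit=True),
    ExiterRecord("A2", employed_at_registration=False, employed_after_exit=False),
    ExiterRecord("A3", employed_at_registration=True,  employed_after_exit=True),
]
print(entered_employment_rate(records))  # 0.5: one positive outcome out of two in the denominator
```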

  14. Data Element Validation
  • Reports will not be accurate if the data used by the software are wrong
  • Requires checking data elements against source documentation to verify compliance with federal definitions
  • Handbooks contain instructions and examples of acceptable source documents for each data element validated
  • States identify state-specific source documentation to reflect the variability of state MIS systems and state/local documentation standards
  • Self-reported elements such as race, gender, and ethnicity are not validated
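As a rough illustration of what a data element validation worksheet captures (the element names and values below are made up, not the actual handbook elements), each sampled record's reported values are compared against what the source documentation supports:

```python
# Hypothetical worksheet comparison; the real handbooks define the actual
# elements, acceptable source documents, and pass/fail rules.
sampled_record = {"date_of_program_entry": "2002-07-15", "veteran_status": "1"}
source_documentation = {"date_of_program_entry": "2002-07-15", "veteran_status": "2"}

def validate_elements(extract: dict, source: dict) -> dict:
    """Mark each data element pass (True) or fail (False) by comparing the
    reported value with the value supported by source documentation."""
    return {element: extract[element] == source.get(element)
            for element in extract}

print(validate_elements(sampled_record, source_documentation))
# {'date_of_program_entry': True, 'veteran_status': False}
```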

  15. Data Element Risks
  • Low-risk data elements
    • Computer generated – wage records
    • Human input with:
      • Minimal judgement (e.g., dates)
      • Low performance impact
  • High-risk data elements
    • Human input with:
      • Considerable judgement (interpreting rules)
      • High performance impact – supplemental employment data

  16. Software Selects Samples for Data Element Validation
  • Sampled records are displayed on automated worksheets
  • Participant records with positive outcomes not based on wage records are over-sampled
  • Software
    • Adjusts error rates based on weights
    • Produces a detailed data element validation report with error rates for each data element
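The weight adjustment mentioned above can be pictured with a small sketch; the weights, groups, and error flags below are hypothetical, and the actual software applies its own sample design:

```python
# Minimal sketch of a weight-adjusted error rate when one group is
# over-sampled. All weights and error flags are hypothetical.
sampled_records = [
    # weight = number of population records each sampled record represents
    {"weight": 1.0, "error": True},    # over-sampled group (e.g., supplemental outcomes)
    {"weight": 1.0, "error": True},
    {"weight": 5.0, "error": False},   # group sampled at a lower rate
    {"weight": 5.0, "error": True},
]

def weighted_error_rate(records):
    """Weight each sampled record by the population it represents before
    computing the share of records in error."""
    total = sum(r["weight"] for r in records)
    errors = sum(r["weight"] for r in records if r["error"])
    return errors / total

print(weighted_error_rate(sampled_records))  # 7/12, about 0.58, vs. 0.75 unweighted
```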

  17. Data Element Validation by Program
  • For WIA, TAA, LX, MSFW, DINAP, and SCSEP, the software generates worksheets for sampled records
  • For WIA and TAA, cluster sampling is used to reduce the number of offices to be visited
  • For LX, no data validation against source documents – 25 cases are reviewed to ensure that the extract file was built correctly
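A simplified sketch of the cluster-sampling idea follows (office names, record counts, and sample sizes are hypothetical): offices are sampled first, then records within the selected offices, so source documents only need to be reviewed at a handful of sites.

```python
# Two-stage (cluster) sampling sketch; not the actual WIA/TAA sample design.
import random

def cluster_sample(records_by_office, n_offices, n_records_per_office, seed=0):
    """Stage 1: sample offices. Stage 2: sample records within each
    selected office."""
    rng = random.Random(seed)
    offices = rng.sample(sorted(records_by_office), k=n_offices)
    sample = []
    for office in offices:
        records = records_by_office[office]
        k = min(n_records_per_office, len(records))
        sample.extend((office, r) for r in rng.sample(records, k=k))
    return sample

records_by_office = {f"office_{i}": list(range(200)) for i in range(30)}
sample = cluster_sample(records_by_office, n_offices=5, n_records_per_office=20)
print(len(sample))  # 100 records, drawn from only 5 of the 30 offices
```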

  18. Benefits of Performance and Analysis Software
  • Provides technical assistance to states
  • Reduces burden on local offices and small states
  • Clear and easy analysis of outcomes
    • For example, the impact of zero pre-program earnings
  • Makes underlying performance data accessible to managers
  • Breaks out performance by many factors and checks for errors
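As a toy calculation (hypothetical earnings and a simplified change measure, not the official formula), the sketch below shows how participants with zero pre-program earnings can dominate an average earnings change, which is the kind of breakout the analysis software makes easy to see:

```python
# Hypothetical earnings; illustrates the effect of zero pre-program earnings
# on an average earnings-change figure.
participants = [
    {"pre": 0,    "post": 3500},
    {"pre": 0,    "post": 4200},
    {"pre": 3000, "post": 3300},
    {"pre": 2800, "post": 2900},
]

def avg_earnings_change(group):
    return sum(p["post"] - p["pre"] for p in group) / len(group)

all_participants = avg_earnings_change(participants)
nonzero_pre_only = avg_earnings_change([p for p in participants if p["pre"] > 0])
print(all_participants, nonzero_pre_only)  # 2025.0 vs. 200.0
```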

  19. Software Allows for Flexible Data Analysis
  • Software will report by user-selected time period (weekly, monthly, quarterly, annually)
  • Users can also select reports by state or sub-state breakouts, including WIB, office, or case manager
    • Not multiple offices per participant unless the state loads separate files
    • Software may be enhanced to allow multiple counts
  • Users can sort participant records by any field within performance outcome groups – a three-tiered sort will be available
  • Users can also export participant groups for analysis, local feedback, or WRIS requests
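The three-tiered sort amounts to ordering records by three user-selected fields in priority order; the field names below are hypothetical stand-ins for whatever the user chooses:

```python
# Sketch of a three-tiered sort within a performance outcome group.
participant_records = [
    {"wib": "WIB-02", "office": "North", "exit_date": "2003-03-31"},
    {"wib": "WIB-01", "office": "South", "exit_date": "2003-01-15"},
    {"wib": "WIB-01", "office": "North", "exit_date": "2003-02-28"},
]

# Tier 1: WIB, tier 2: office, tier 3: exit date.
three_tier_sorted = sorted(
    participant_records,
    key=lambda r: (r["wib"], r["office"], r["exit_date"]),
)
for r in three_tier_sorted:
    print(r["wib"], r["office"], r["exit_date"])
```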

  20. Reporting of Validation Results
  • Software produces
    • Report validation summary
    • Data element validation summary and analytical reports
  • WIA and LX software creates files with the annual report validation values for upload to ETA

  21. Visual Basic Applications
  • Software runs on any Windows operating system
  • No other software required
  • For large files, MS SQL Server is an option if the state has a license (UI and LX only)
  • Front-end edit checks ensure proper format of records
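The front-end edit checks are essentially format rules applied to each extract record before it is accepted. The sketch below (written in Python for brevity, although the actual applications are Visual Basic) uses hypothetical fields and formats to show the idea:

```python
# Illustrative front-end edit checks; field names and formats are assumptions,
# not the actual extract file specification.
import re

EDIT_CHECKS = {
    "ssn":         re.compile(r"\d{9}"),
    "exit_date":   re.compile(r"\d{4}-\d{2}-\d{2}"),
    "wage_amount": re.compile(r"\d+(\.\d{2})?"),
}

def check_record(record: dict) -> list:
    """Return the fields that fail their format check, so malformed extract
    records can be rejected before validation runs."""
    return [field for field, pattern in EDIT_CHECKS.items()
            if not pattern.fullmatch(str(record.get(field, "")))]

print(check_record({"ssn": "123456789", "exit_date": "2003-6-30", "wage_amount": "1200.00"}))
# ['exit_date']
```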

  22. Next Generation Reporting and Performance System
  • In Fall 2004, states may use federal software to:
    • Generate reports
    • Perform and report on data validation
    • Edit and transmit individual participant records
  • Software likely to be developed as part of the new EIMS software development effort

  23. Web-Based State Internal Audit Tool
  • States want the capability to perform data element validation at the sub-state level
  • Proposed design:
    • Software would generate samples for any level (WIB, office) upon request from an authenticated user (through the web)
    • Users can complete worksheets and generate reports on-line
    • One sample per WIB or office per imported file
    • Will be able to report multiple offices per participant
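Since the tool is only a proposal, the following is just a minimal sketch of the one-sample-per-WIB-or-office-per-imported-file rule; all names and the sample size are assumptions:

```python
# Hypothetical sketch: repeat requests for the same WIB/office and imported
# file return the same sample rather than drawing a new one.
import random

_existing_samples = {}   # (file_id, unit) -> list of sampled records

def get_or_create_sample(file_id: str, unit: str, records: list, size: int = 25, seed: int = 0):
    """Return the sample for this WIB/office and imported file, creating it
    only on the first request."""
    key = (file_id, unit)
    if key not in _existing_samples:
        rng = random.Random(seed)
        _existing_samples[key] = rng.sample(records, k=min(size, len(records)))
    return _existing_samples[key]
```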

  24. Benefits of Internal Audit Tool
  • States and the federal government are dependent upon data quality at the local level
  • Increase the efficiency and precision of existing state monitoring efforts
  • Potential cost savings for the system as a whole
