Performance Accountability: Improving Data Accuracy and Reporting


Presentation Transcript


  1. Performance Accountability: Improving Data Accuracy and Reporting. Washington State Web-Ex, August 22, 2014

  2. Objective • Mutual Understanding of Data Collection-Entry-Reporting accountability from Local Areas to State to DOL. • Encourage discussion of data collection and reporting requirements, procedures, and guidance. • Establish open forums for communication for technical assistance.

  3. Background • Oversight agencies like GAO and OIG cite data quality issues with ETA’s data (2002) • Guidance issued annually containing report submission deadlines and source documentation requirements • Guidance for PY12 included TEN 4-13, 8/28/13, and TEGL 28-11 for PY11/FY12 Reporting and Data Validation

  4. OIG Audit of Federal Monitoring • OIG Conducted Follow Up Audit in 2008 • One of five audit questions: Does ETA have an effective monitoring process?

  5. Policies/Procedures and Training • Data management and the resultant quality of reported data are derived from and influenced by the policies, procedures and protocols utilized at the state and/or local levels • Grantees should develop guidance for staff and sub-grantees involved in the collection of data: • Definitions of data elements • Sources of information • Participant record and documentation requirements • Procedures for collecting, entering and reporting data and associated “business rules” that cover timeliness and completeness • Procedures for entering data into an automated database • Procedures for correcting data

  6. Training and Monitoring • Data collection and data entry: • Routine training should be provided on data management guidance • All staff involved in the collection or entry of data should be trained in the procedures • The data entry process should include steps for verifying entered data against original sources, either on a sample basis or for the entire population of records (a minimal sketch of this check follows)
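A minimal sketch, in Python, of the sample-based verification step described on slide 6; the record layouts, field names, and participant IDs are hypothetical placeholders rather than any actual state MIS export.

```python
import random

# Hypothetical record layouts: one entry per participant from the MIS export,
# and one per participant abstracted from the source documents in the file.
mis_records = {
    "P-1001": {"date_of_birth": "1990-04-12", "date_of_first_service": "2014-02-03"},
    "P-1002": {"date_of_birth": "1987-11-30", "date_of_first_service": "2014-03-17"},
}
source_records = {
    "P-1001": {"date_of_birth": "1990-04-12", "date_of_first_service": "2014-02-03"},
    "P-1002": {"date_of_birth": "1987-11-03", "date_of_first_service": "2014-03-17"},
}

def verify_sample(mis, source, sample_size=None, seed=0):
    """Compare MIS entries to source documentation for a random sample
    (or the entire population when sample_size is None)."""
    ids = sorted(mis)
    if sample_size is not None:
        random.seed(seed)
        ids = random.sample(ids, min(sample_size, len(ids)))
    mismatches = []
    for pid in ids:
        for field, mis_value in mis[pid].items():
            source_value = source.get(pid, {}).get(field)
            if mis_value != source_value:
                mismatches.append((pid, field, mis_value, source_value))
    return mismatches

# Each mismatch identifies the record, the data element, and both values for correction.
for pid, field, mis_value, source_value in verify_sample(mis_records, source_records):
    print(f"{pid}: {field} entered as {mis_value}, source shows {source_value}")
```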

  7. Myth: Data Management is Purely Technical. Reality: In the end, data management is really about providing the best possible services. Reporting and validation are there to support effective service provision. Accurate reporting is a requirement, but it is also a tool. Staff should develop a holistic understanding of both the data and the real participants and services the data represent.

  8. Myth: Data Management is Easy. Reality: Data management is HARD! Business rules are complex and multi-layered. Data sets are large and hard to visualize. Specs are complex and evolving. Circumstances change. And sometimes the closer you look, the less clear things become, which is another challenge.

  9. What’s Involved in Analyzing Results? • Review reports summarizing performance on the common measures, other federal measures, and/or project-specific measures • Talk to staff and others with first-hand knowledge of the program and its operation • Generate questions related to the logic of the program design and current environment • Develop a list of performance issues

  10. Fishbone Diagram [Figure: fishbone diagram of the challenge "Low Performance on the Attainment of a Diploma or Certificate Rate." Potential contributing factors: high training drop-out rate, certificates not recorded, increased number of youth with multiple barriers, poor quality services, late project start-up, and assignment to the wrong services. Data sources listed for the factors: MIS/reason for dropping out, MIS/participant file documentation, MIS/participant characteristics in files, MIS/participant file assessments, and monitoring reports.]

  11. WIA (Workforce Investment Act), effective through PY15: • 'Services'-based participation and exit • Data validation required • Reporting cohort primarily 1st to 3rd quarter after exit • Nine common measures • Reporting participant information • Sequence of services: core, intensive, training. WIOA (Workforce Innovation and Opportunity Act), effective PY16*: • 'Services'-based participation and exit • Data validation codified • Reporting cohort extended to 2nd to 4th quarter after exit • Twelve primary indicators of performance • Expanded reporting participant information • 'Career services' and training performance accountability. *Unless the State is an 'early implementer'.

  12. Reporting Requirements • Common Measures • Aggregate Counts • Individual Records • Demographics • Outcomes • Services and Activities • Types • Dates

  13. Reporting “Most Recent” Activities • Most Recent Date Received Staff-Assisted Services • Most Recent Date Received Intensive Services • Most Recent Date Received Rapid Response Services • Most Recent Date Received Educational Achievement Services • Most Recent Date Participated in Alternative School • Most Recent Date Participated in Work Experience • Most Recent Date Received Leadership Development Opportunities
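As an illustration only, the "most recent" activity dates above could be carried as fields on an individual participant record; the field names below are paraphrases of the list on slide 13, not the official record-layout element names.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MostRecentActivityDates:
    """Illustrative subset of an individual participant record:
    each field holds the latest date the activity was received, or None."""
    staff_assisted_services: Optional[date] = None
    intensive_services: Optional[date] = None
    rapid_response_services: Optional[date] = None
    educational_achievement_services: Optional[date] = None
    alternative_school: Optional[date] = None
    work_experience: Optional[date] = None
    leadership_development: Optional[date] = None

def record_service(dates: MostRecentActivityDates, field: str, service_date: date) -> None:
    """Keep only the most recent date for a reported activity."""
    current = getattr(dates, field)
    if current is None or service_date > current:
        setattr(dates, field, service_date)

# Example: two staff-assisted services; only the later date is reported.
rec = MostRecentActivityDates()
record_service(rec, "staff_assisted_services", date(2014, 2, 3))
record_service(rec, "staff_assisted_services", date(2014, 5, 20))
print(rec.staff_assisted_services)  # 2014-05-20
```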

  14. Participant • An individual determined eligible to participate in the program who receives a service funded by the program in either a physical location (e.g., One-Stop Center) or remotely through electronic technologies. • Three Components • Determined eligible to participate in the program • Receives a funded service • In either a physical location or through electronic technologies

  15. Components of Participant • Individual determined eligible to participate • Depends on program/funding; doesn’t apply in the case of W-P, which is based on universal access • Receives a service • Not all services trigger participation; it’s important to understand the distinction between those that do and those that don’t • In a physical location or remotely • Many substantial services are remotely accessed; this needs to be captured

  16. Multiple Program Participation: Counting Participants in Multiple Programs • Participation is based on the earliest date of service • An individual can participate in several programs simultaneously • Counted as a participant in each of those programs • The participant will not exit from a program unless there is a gap of no service for 90 days (see the sketch below)
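A minimal sketch of this counting rule, assuming that an individual's participation in each program dates from the earliest service funded by that program; the program names and service dates are illustrative only.

```python
from datetime import date

# Hypothetical services for one individual, tagged by funding program.
services = [
    ("WIA Adult", date(2014, 1, 6)),
    ("Wagner-Peyser", date(2014, 1, 6)),
    ("WIA Adult", date(2014, 3, 10)),
    ("WIA Dislocated Worker", date(2014, 4, 2)),
]

def participation_dates(service_list):
    """The individual is counted as a participant in each program served;
    each program's participation date is the earliest date of service in it."""
    dates = {}
    for program, service_date in service_list:
        if program not in dates or service_date < dates[program]:
            dates[program] = service_date
    return dates

for program, first_service in participation_dates(services).items():
    print(f"{program}: participant as of {first_service}")
```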

  17. A Service Is: Any core, intensive or training activity made available to eligible participants that allows them to benefit from specific programs in the workforce system.

  18. Services that Do Not Begin or Extend Participation • Eligibility determination • Case management administrative activities to obtain information regarding employment status, educational progress, need for additional services, etc. • Income maintenance or support payments • Visits to One-Stop Centers for reasons other than their intended purposes • Follow-up services (a sketch of screening out these activities follows)
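A minimal sketch of how a system might screen out these non-qualifying activities; the category labels below are illustrative, since the actual service taxonomy is defined by program guidance and the state MIS.

```python
# Illustrative labels for activities that do NOT begin or extend participation.
NON_QUALIFYING = {
    "eligibility_determination",
    "case_management_admin",       # status checks, contact attempts, etc.
    "income_maintenance",
    "support_payment",
    "center_visit_other_purpose",
    "follow_up_service",
}

def begins_or_extends_participation(activity_category: str) -> bool:
    """Core, intensive, and training services count toward participation;
    the administrative activities above do not."""
    return activity_category not in NON_QUALIFYING

print(begins_or_extends_participation("training"))           # True
print(begins_or_extends_participation("follow_up_service"))  # False
```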

  19. Participation Cycles and Dates of Service • Although there are clear issues around exit, there are also issues around participation cycles and dates of service in general • Service provision prior to formal participation • Staff unclear about which services commence participation • Dates of service inconsistent between the participant file and the MIS, within the MIS, within the file, and within documents

  20. When Is a Service Included in Performance? • Core, intensive, or training services made available to eligible participants that require significant staff involvement count toward performance • Individuals who receive such services and exit the program are included in the performance measures • WIA Adult and DW program participants who receive only self-service or informational activities are excluded from performance

  21. Date of Exit • The participant has not received a service funded by the program or by a partner program for 90 consecutive calendar days.

  22. Exiter • A participant who hasn’t received a program or partner-funded service for 90 consecutive days and no future services are scheduled • Three components • Hasn’t received a service • For 90 consecutive days • No future services scheduled

  23. Components of Exiter • The participant hasn’t received a service • Could be program- or partner-funded • For 90 consecutive calendar days • A gap in service can stop the 90-day clock if based on specific/allowable circumstances • No future services scheduled • Specific services and activities as allowable • Does not include any follow-up services or circumstances where the participant voluntarily withdraws or drops out of the program
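A minimal sketch of the exit logic on slides 21-23, assuming a plain list of service dates and a reporting date; allowable gaps that stop the 90-day clock and scheduled future services are reduced to simple flags here.

```python
from datetime import date, timedelta

EXIT_WINDOW = timedelta(days=90)

def exit_date(service_dates, as_of, future_service_scheduled=False, allowable_gap=False):
    """Return the exit date (the date of the last program- or partner-funded
    service) once 90 consecutive calendar days have passed with no service,
    no scheduled future service, and no allowable gap; otherwise None."""
    if not service_dates or future_service_scheduled or allowable_gap:
        return None
    last_service = max(service_dates)
    if as_of - last_service >= EXIT_WINDOW:
        return last_service  # exit is recorded retroactively as of the last service
    return None

services = [date(2014, 1, 6), date(2014, 3, 10), date(2014, 4, 2)]
print(exit_date(services, as_of=date(2014, 5, 15)))  # None: 90 days not yet elapsed
print(exit_date(services, as_of=date(2014, 7, 2)))   # 2014-04-02: exit recorded
```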

  24. Extending the Exit Date • Services provided by partner programs can extend the point of exit • Participant services provided after the end of planned activities, but before the exit is recorded, also extend the exit date • Follow-up services do not extend the exit date

  25. When To Exit • Services and activities should be closed when the service plan or service strategy is complete • The service plan is a "living document," with additions and changes possible • Co-enrollment in different funding streams, additional partner services, and a valid gap in service can extend the exit date

  26. Illustration: Participation and Exit [Figure: timeline of a participation cycle. The participation date is when an eligible individual first receives a service; participation continues through the last service; the exit date is the date of that last service, recorded after a 90-day period with no services or follow-up services only.]

  27. Further Clarification of DATES • Participation and Exit Dates are always dates of service • Participation Date reflects the first funded service • Exit Date reflects the last funded service • Translation: there is no more 'hard exit' • Not intended to take responsibility away from case managers; case managers do not have to wait 90 days to begin providing follow-up services • Although federal guidance states that an exit cannot be officially recorded until the 90 days have elapsed, it is possible to use a 'case closure' MIS code or an 'exit' form, for example

  28. Non-Compliance with EXIT Requirements • Exit dates not reflective of dates of last service • ‘Case management’ used to extend exit date • Hard exits utilized • Date of last contact = Exit date • Date of employment = Exit date • Services provided within 90 days • Lack of common exit date (across core workforce programs) • Exit dates not consistent with dates in MIS

  29. Follow Up Services/Retention • Do NOT extend Participation • Twelve Months: Required for Youth Participants; Available for Adult and DW • Post-Employment Services to Ensure: • Entered Employment • Employment Retention • Earnings • Career Progress

  30. Follow Up Services • Follow-up begins after the expected last service • Youth are required to receive at least 12 months of follow-up services, which are triggered at exit (the only exclusion is for summer youth employment) • Not intended to take responsibility away from case managers for WIA. Case managers do not have to wait 90 days, for instance, to begin providing follow-up services.

  31. Source Documentation Whether scanned, paper, or system cross-match, the purpose of source documentation is to have an auditable trail that documents the participant, services delivered and outcomes received.

  32. Discussion • Thoughts? • Observations?

  33. WIOA Performance Accountability Overview

  34. Core Programs' Performance Measures (except WIOA Youth) • Entered Employment • 2nd quarter after exit • Employment Retention • 4th quarter after exit • Earnings • Median earnings 2nd quarter after exit • Credential Rate • New; Up to one year after exit; Doesn't apply to WP • In-Program Skills Gain • New; Achieving measurable skills gains; Doesn't apply to WP • Employer Effectiveness • New; before PY16
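As a worked illustration only, using a simplified exiter cohort rather than the official specification's calculation rules, the second-quarter employment, fourth-quarter retention, and median-earnings indicators could be tallied as follows.

```python
from statistics import median

# Hypothetical exiter cohort: employment status and wages in the
# 2nd and 4th quarters after the exit quarter.
cohort = [
    {"id": "P-1001", "employed_q2": True,  "employed_q4": True,  "earnings_q2": 6500},
    {"id": "P-1002", "employed_q2": True,  "employed_q4": False, "earnings_q2": 5200},
    {"id": "P-1003", "employed_q2": False, "employed_q4": False, "earnings_q2": 0},
]

employment_rate_q2 = sum(p["employed_q2"] for p in cohort) / len(cohort)
retention_rate_q4 = sum(p["employed_q4"] for p in cohort) / len(cohort)
median_earnings_q2 = median(p["earnings_q2"] for p in cohort if p["employed_q2"])

print(f"Employment 2nd quarter after exit: {employment_rate_q2:.0%}")    # 67%
print(f"Employment 4th quarter after exit: {retention_rate_q4:.0%}")     # 33%
print(f"Median earnings 2nd quarter after exit: ${median_earnings_q2}")  # $5850
```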

  35. WIOA Youth Performance Measures • Placement Rate (Education, Employment) • 2nd quarter after exit • Retention (Education, Employment) • New; 4th quarter after exit • Earnings • New; Median earnings 2nd quarter after exit • Credential Rate • Up to one year after exit • In-Program Skills Gain • New; Achieving measurable skills gains • Employer Effectiveness • New; before PY16

  36. What’s Eliminated • Literacy/Numeracy indicator for youth • Although utilized in development of Skills Gain measure • Customer Satisfaction as statutory measure • State Incentive Funds • But Governor’s reserve may be used for local incentives

  37. Additional Provisions • State Targets • Must use statistical adjustment model; its use is now codified (Sec. 116(b)(3)(A)(viii)) • Targets for first two years included in State Plans • Additional Information required in Annual Reports • Example: Amount of funds spent on each type of service • Data Validation now codified (Sec. 116(d)(5))

  38. Additional Provisions • Sanctions • State Level • If a state fails performance, the Secretaries shall provide TA (the previous language said will provide TA upon request) • If a state fails for a 2nd consecutive year or fails to submit its Annual Report, it can lead to a reduction in statewide funds (stronger language) • Local Level • If failure continues for a 3rd consecutive year, the Governor must take corrective action, which shall include development of a reorganization plan (and a new local board)

  39. Questions • Final Questions or Comments
