
DEI Grantee Conference DEI Evaluation March 12, 2013


Presentation Transcript


  1. Social Dynamics, LLC: Bringing Complex Issues into Focus. DEI Grantee Conference, DEI Evaluation, March 12, 2013. In partnership with Altarum Institute, Berkeley Policy Associates, and Mathematica Policy Research.

  2. Overview of the Session • Welcome & Introductions • ETA/ODEP Partnership • DEI Evaluation Team • Grantees’ Direct Link to the DEI Evaluation: State Evaluation Liaisons • Overview of the DEI Evaluation Design • Six Main Research Questions of the DEI Evaluation • Partnership & Collaboration Objectives of the DEI Evaluation Team

  3. Overview of the Session (continued) • DEI Evaluation Synthesis Reports • What We Learned • DEI Data Quality • What do we mean by data quality? • Why is it important? • Systems Change Component of the Evaluation • What is it? • How will it be used? • SCCS Structure

  4. Welcome & Introductions ODEP/ETA Partnership • ODEP—Division of Policy Planning & Research • Dr. Richard Horne, Supervisory Research Analyst • Dr. Janet Voight, Contracting Officer’s Representative • ETA—Disabilities Program • Randee Chafkin, M.A., Senior Program Specialist • ETA Regional FPOs

  5. DEI Evaluation Team (organization chart) • ODEP COR: J. Voight • ODEP: R. Horne • Subject Matter Experts: D. Stapleton, G. Livermore • Project Director: D. Klayman • TA & Grantee Communications: SD • Site Visit Coordination: SD • Qualitative Analysis: BPA • R.A. & LWIB Selection: Mathematica • Statistical Analysis: SD • Data System: Altarum

  6. Grantees’ Direct Link to the DEI Evaluation: State Evaluation Liaisons • Each grantee has a State Evaluation Liaison who will: • Respond promptly to DEI Evaluation-related questions from State, LWIB, and American Job Center personnel; • Ensure smooth implementation of the DEI Data System at the American Job Center level; and • Coordinate, design, and schedule technical assistance related to the DEI Evaluation (e.g., Data System site visits, webinars, conference calls).

  7. DEI Evaluation Liaisons • Kim Barlow, M.P.H. (KBarlow@socialdynamicsllc.com) • Dr. Robert Bartolotta (RBartolotta@socialdynamicsllc.com) • Danielle Herbert, M.A. (DHerbert@socialdynamicsllc.com) • Dr. Becca Jackson (Rjackson@socialdynamicsllc.com) • Dr. Susan Labin (Slabin@socialdynamicsllc.com) • Dr. Robert Bleimann (RBleimann@socialdynamicsllc.com) • Dr. Douglas Klayman (dklayman@socialdynamicsllc.com) • Marilia Mochel, M.A. (MMochel@socialdynamicsllc.com) • State assignments: IL, KS, NJ; AR, IN, LA, MN; HI, WA; FL, MA, RI; AK, OH, TN, SD; CA, WI; IA; DE, ME, NY, VA • 1-855-990-1105 (toll free)

  8. The DEI Evaluation Design • The DEI Evaluation is a comprehensive implementation/process, outcome, and impact study. • Implementation: Collection of information on systems change and the strategies implemented by each of the DEI grantees (site visits/telephone interviews). • Outcomes: Focus on customer-level outcomes based on administrative data (WIASRD, W-P, DEI Data System). • Impact: Focus on comparing DEI pilot sites with comparison sites (WIASRD, W-P, DEI Data System).

  9. The DEI Evaluation Design: Random Assignment • Random assignment ensures that, in expectation, the DEI pilot and comparison sites are equivalent with respect to extraneous variables; on average, the characteristics of LWIAs in both groups are the same. The comparison group serves to illustrate what would have happened in an equivalent group of LWIAs in the absence of the DEI.
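
To make the site-level random-assignment step concrete, here is a minimal sketch in Python; the LWIA names, the fixed seed, and the even pilot/comparison split are illustrative assumptions, not details of the actual DEI assignment procedure.

```python
import random

# Hypothetical pool of LWIAs within a grantee state (names are illustrative).
lwias = ["LWIA-01", "LWIA-02", "LWIA-03", "LWIA-04", "LWIA-05", "LWIA-06"]

random.seed(42)        # fixed seed so the assignment is reproducible
random.shuffle(lwias)  # random ordering removes systematic differences between groups

half = len(lwias) // 2
pilot = lwias[:half]        # sites that receive the DEI intervention
comparison = lwias[half:]   # sites that show what would have happened without the DEI

print("Pilot:", sorted(pilot))
print("Comparison:", sorted(comparison))
```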

  10. The DEI Evaluation Design (continued) • Site Visits: Years 1 and 3 to discuss start-up and implementation issues. (Component: Implementation) • Mid-Term Telephone Interviews: Year 2 interviews with DEI State Leads to learn how grantees use the selected strategies to improve their public Workforce Development Systems and better serve PWDs. (Component: Implementation) • WIASRD, Wagner-Peyser, and the DEI Data System: Customer outcome data will be sent quarterly to the Evaluation Team. (Component: Outcome & Impact)

  11. Six Main Research Questions • 1. What start-up issues were identified? How were these issues resolved? Sources: Site visits and telephone interviews. (Component: Implementation) • 2. What are the impacts of the DEI grants on customer utilization and disability employment outcomes such as employment, earnings, and employment retention? Source: Administrative data (WIASRD/W-P/DEI Data System). (Components: Customer Outcomes & Impact) • 3. What services and strategies appear to be the most successful in helping customers achieve employment outcomes? Sources: Site visits, telephone interviews, and administrative data. (Component: Implementation)

  12. Six Main Research Questions (continued) • 4. What are the challenges that need to be addressed to improve the employment outcomes of PWDs? Sources: Site visits and telephone interviews. (Component: Implementation) • 5. What systems changes need to be made to overcome these challenges? Sources: Site visits and telephone interviews. (Components: Implementation & System Change Outcomes) • 6. What best practices have the pilot sites identified that address these systems changes? Sources: Site visits and telephone interviews. (Component: Implementation)

  13. Partnership & Collaboration Objectives of the DEI Evaluation Team • The DEI Evaluation is a partnership between DOL, the DEI grantees, and the Evaluation Team. There are five partnership objectives: • To support the collection of pilot and comparison DEI Evaluation data in LWIAs and American Job Centers through telephone, on-site, and webinar technical assistance. • To provide ongoing support for implementing the DEI Evaluation and using results for program monitoring and improvement.

  14. Partnership & Collaboration Objectives (continued) • 3. To involve grantees and stakeholders in the evaluation process and use their input to inform the analysis of findings related to program context and systems change. • 4. To provide annual presentations on DEI Evaluation results (when available) to inform local planning activities. • 5. To maintain open lines of communication throughout the program period.

  15. DEI Evaluation Synthesis Report(s) • What is the Annual Synthesis Report? • Informs ODEP and ETA of progress made to date on DEI goals; • Provides useful information on DEI start-up (on both the program and the evaluation sides); and • Documents emerging systems changes.

  16. What We Learned (Examples from Years 2–3, R1 & R2 States) • Rounds 1 & 2 Grantees: Moving Forward • Numerous R1 and R2 grantees have made significant strides in raising awareness of the needs of CWDs, developing their networks of partners, and extending training opportunities statewide. • A DRC in ME provides training to executive-level LWIB staff, described as “revolutionary” by those who attend: “It gave us, for the first time, a comprehensive understanding of what we need to do to create better access to services for customers with disabilities.” • To promote the sustainability of the DEI, IL DRCs train American Job Center staff on the availability of benefits and how to serve TTW customers.

  17. What We Learned (continued) • Rounds 1 & 2 Grantees: Moving Forward (continued) • In NJ, one LWIA distributes DEI resources across four different people who play the DRC role. This initiative has increased the availability and visibility of the DRC position and is expected to improve the knowledge of American Job Center staff through informal and formal training opportunities. • DRCs in NY are now all CWIC-trained! • Several states, including DE, ME, SD, and TN, have created customer-focused IRTs that bring multiple agency partners to the table (in person) on a customer-by-customer basis. This arrangement has led to the braiding of funds, which increases the availability of programs and services.

  18. What We Learned (continued) • Rounds 1 & 2 Grantees: Moving Forward (continued) • TN uses a statewide resource-mapping initiative to identify available resources (and the need for new resources) in order to build capacity and increase access to and availability of programs and services for CWDs. • An LWIA in NJ developed a comprehensive, customer-driven monitoring process to ensure that American Job Center CWDs help shape the kinds of trainings, programs, and services the American Job Center offers.

  19. What We Learned (continued) • Variation in Implementation • The DEI requires grantees to implement specific strategies and infrastructure arrangements but places few requirements on exactly how that implementation should happen. This approach gives grantees the flexibility to build on existing strategies and to explore new and innovative strategies for serving PWDs and, in Year 2, gave us and DOL valuable information about promising practices in this regard. • Because of this flexibility, we expected to, and did, find a great deal of variation across the grantees. • DRCs may work either at the American Job Center/consumer (local) level or at the LWIA (system) level. In most cases, the State defines the DRC role, but how that role is operationalized varies (often by LWIA). • Stark differences in the DRC role occur within the same state and in contiguous LWIAs.

  20. What We Learned (continued) • Variation in Implementation (continued) • I. Local-level (American Job Center/consumer) DRC responsibilities vary. There are three types: a. Focuses exclusively on TTW. b. Works only with CWDs. c. Plays a support role (helping other staff) as well as an employment counselor role. • Best Practice of Local-Level DRCs • Provide informal training and support to increase American Job Center staff’s overall knowledge of CWDs, rather than serving as “the funnel” for CWDs who enter the American Job Center.

  21. What We Learned (continued) • Variation in Implementation (continued) • II. System-level DRCs spend most of their time overseeing the work of American Job Center employment counselors, helping them to identify key training and benefits opportunities for CWDs. They are more likely than local-level DRCs to offer formal training to American Job Center staff and to network outside the American Job Center with the LWIB and other local systems (e.g., education, juvenile justice, the employer community). • Best Practice of System-Level DRCs • Implement the DRC role so that it increases the knowledge of staff who have the skills to work with CWDs (AK). Relatedly, some states use supervisory (LWIA-level) DRCs (IL, ME, VA) whose job is to oversee local DRCs and/or American Job Center employment counselors in order to build capacity.

  22. What We Learned (continued) • Examples of Variation in Implementation • There is also substantial variation in how DEI strategies are implemented at the LWIA level. • In some sites, Integrated Resource Teams (IRTs) are customer-centric and meet in person with the DRC, agency/organizational team members, the customer, and (for youth) parents/guardians. • Elsewhere, IRTs focus narrowly on problem resolution: DRCs separately contact “the individuals that are important to what we’re trying to do in terms of funding.” This approach addresses the problem at hand (e.g., the customer needs certificate-based training, or needs a GED to get a job) but provides no opportunity to brainstorm ideas with the customer and partners, and it involves less customer input. • Customized employment is defined very generally as a way to fit a job around an individual. We have not seen the more intensive approach that includes Discovery and Negotiation in the states that selected this strategy.

  23. What We Learned (continued) • Examples of Variation in Implementation (continued) • Asset Development is being defined broadly as both informal discussions and formal trainings on savings accounts, banking, and financial literacy. LWIAs do not always use a specific curriculum; more often, they incorporate modules into more general trainings on job readiness. • The Guideposts for Success are used informally by youth states as a general structure for youth services. At this point, there is no evidence that they have been incorporated into youth programming. • Disability disclosure remains a challenge. In some LWIAs, DRCs coach employment counselors to try to obtain information on customers’ disabilities in order to help them get certain services, while in other states DRCs and employment counselors rely on self-disclosure only. This variability may affect the number of American Job Center-enrolled CWDs.

  24. What We Learned (continued) • Concerns of R1 & R2 States • Some American Job Centers are concerned that their EN status is being usurped by the State EN, because funds collected by American Job Center ENs go to the State. • In a number of states, existing providers that are ENs are unhappy with the new EN status of their local American Job Centers. This arrangement puts the system into a competitive mix. • Several LWIA staff mentioned the loss of CWIC services as a deficit. • In youth states, speaking with a customer under the age of 18 requires parental permission, which complicates the process of engaging youth in Workforce Development services.

  25. What We Learned (continued) • Concerns of R1 & R2 States (continued) • Some youth states have had difficulty placing formerly incarcerated youth and have spent considerable time working with youth to have their criminal records expunged. • Several American Job Center staff expressed concern about the lack of awareness of mental disabilities: “Employers don’t seem to think that mental health issues are important. They don’t think of these customers as having a disability, which creates difficulties when symptoms affect behavior.”

  26. Data Quality What is Data Quality? • The degree to which a dataset can provide accurate and reliable information. • Content quality identifies erroneous information (e.g., missing, out-of-range entries). • Predictive quality measures the extent to which a dataset can accurately estimate population parameters.
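
As an illustration of the content-quality checks described above, the sketch below flags missing, duplicate, and out-of-range entries with pandas; the column names, example records, and valid codes are hypothetical stand-ins, not the actual WIASRD/W-P/DEI Data Element definitions.

```python
import pandas as pd

# Hypothetical customer records; column names and codes are illustrative.
records = pd.DataFrame({
    "ssn": ["123-45-6789", None, "123-45-6789", "000-00-0000"],
    "disability_status": [1, 2, None, 9],
})

# Content quality: flag missing, duplicate, and out-of-range entries.
missing_ssn = records["ssn"].isna()
duplicate_ssn = records["ssn"].duplicated(keep=False) & records["ssn"].notna()
invalid_ssn = records["ssn"].eq("000-00-0000")            # placeholder SSN pattern
invalid_status = ~records["disability_status"].isin([1, 2])  # assume 1/2 are valid codes

error_rate = (missing_ssn | duplicate_ssn | invalid_ssn | invalid_status).mean()
print(f"Records with at least one content error: {error_rate:.0%}")
```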

  27. Data Quality (continued) Why is Data Quality Important? • It puts us in a better position to accurately measure the impact of the DEI. • It ensures we can produce analyses of important subgroups (e.g., TTW customers). • It ensures that your work is accurately represented in annual and final reports.

  28. Data Quality (continued) What Improves Data Quality? • Consistency in how each Data Element is defined. • Is the perceived-barriers question asked the same way by each grantee? • Completion of all required Data Elements. • Are all Data Elements being completed by staff at each American Job Center? • Consistency in American Job Center operations. • Do all LWIA American Job Centers ask the “disability question”? • Are data collected from customers at approximately the same time intervals? • Are the eligibility criteria for certain services approximately the same across American Job Centers?

  29. Data Quality (continued) How can the DEI Evaluation Team help with Data Quality? • Careful review of all DEI grantee American Job Centers to ensure that they collect the required DEI Data Elements. • Provide Data Quality Reports that pinpoint where errors occur. • Quarterly Data Quality Reports sent to DEI State Leads. • Webinars for State, LWIB, and American Job Center staff on DEI data collection requirements. • Issue Briefs on data quality and data collection requirements. • Toll-Free Telephone Technical Assistance available to DEI State Leads, State IT staff, and American Job Center staff.

  30. Data Quality (continued) Data Quality and Statistical Control [Figure: control chart with a tolerance band of 3% ± 1 percentage point, showing the percentage of customers expected to fall outside the upper tolerance zone of 4%.] The DEI Evaluation Team will monitor error rates using an error tolerance band of 3% with a tolerance of ±1 percentage point; the acceptable error rate for DEI data is 4%. The objective is to determine what proportion of records have mean error rates above 4% (the upper-specification limit) and to isolate, correct, and resolve these errors. To determine the proportion of case records that have a mean error rate above 4%: Z = (8 − 5) / 2.78 = 1.08, i.e., about 8% of case records have errors.
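
The slide’s calculation can be reproduced with the standard normal distribution. This is a minimal sketch, assuming the mean (5), observed value (8), and standard deviation (2.78) shown above; it uses scipy’s survival function, so the printed tail proportion is whatever the normal approximation implies for Z = 1.08.

```python
from scipy.stats import norm

mean_error = 5.0   # mean error rate across case records (from the slide)
observed = 8.0     # observed error rate for a case record (from the slide)
sd = 2.78          # standard deviation of error rates (from the slide)

z = (observed - mean_error) / sd   # Z = (8 - 5) / 2.78 = 1.08
tail = norm.sf(z)                  # proportion of records expected above this Z
print(f"Z = {z:.2f}, upper-tail proportion = {tail:.1%}")
```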

  31. Data Quality (continued) Data Quality and Statistical Control (continued) [Figure: Pareto chart of percent of total error by type of error (missing SSN, duplicate SSN, invalid SSN, missing disability status) across data reporters (LWIB1 … LWIBk), with a cumulative Pareto curve; cases with missing, duplicate, and/or invalid SSNs account for 30% of N.] Pareto analysis can be used to detect the most frequently occurring sources (or vital causes) of error and the data reporters that have the highest error rates (vital cases). Pareto analysis does this by arranging the dataset according to priority of importance. Where should we be spending our time?
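
A Pareto analysis of this kind takes only a few lines of pandas; the error categories below mirror the chart, but the counts are made-up illustrations. Sorting by frequency and accumulating percentages shows which one or two error types dominate, and therefore where review time is best spent.

```python
import pandas as pd

# Hypothetical error counts by type (categories mirror the chart above).
errors = pd.Series(
    {"Missing SSN": 120, "Duplicate SSN": 60,
     "Invalid SSN": 30, "Missing disability status": 15},
    name="count",
)

pareto = errors.sort_values(ascending=False).to_frame()
pareto["pct_of_total"] = pareto["count"] / pareto["count"].sum() * 100
pareto["cumulative_pct"] = pareto["pct_of_total"].cumsum()  # the Pareto curve
print(pareto)
```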

  32. Data Quality (continued) What can we do to Maximize Data Quality? • We will contact you quarterly to report on any data collection issues. • We will provide you with LWIA-level information on, for example: • The number and percentage of records with complete WIASRD, W-P, and DEI Data Element data, for all customers and for CWDs only. • Specific variables or Elements that are not being completed regularly. • If issues arise, we can participate in conference calls with particular LWIAs or American Job Centers.

  33. Data Quality (continued) What can you do to Maximize Data Quality? In addition to collaboration between the Evaluation Team, State Leads, and IT staff, several states have taken additional steps to help ensure data quality. For example: • NY held a webinar to introduce staff to the mechanism for collecting DEI Data Elements. During this call, the importance of complete data was emphasized. • VA permanently adopted all of the DEI Data Elements into its everyday common intake form. • WA is assigning specific data entry duties to AmeriCorps volunteers, who will be specially trained to enter DEI data.

  34. Systems Change & the DEI Evaluation • ODEP and ETA have extensive experience working on systems change issues. • Social Dynamics adapted systems change approaches from ODEP/ETA to create a DEI systems change coding scheme and rubric. • What do we mean by “systems”? • Systems are aligned networks of organizations, individuals, processes, and relationships, grounded in values and principles, that provide PWDs with access to services and supports. • Systems involve collaboration across agencies, employers, Workforce Development staff, other groups, and individuals for the purpose of improving access to services and employment and expanding the array of services and supports.

  35. Systems Change & the DEI Evaluation (continued) • Functional systems have or create the necessary context, structure, and expertise to achieve their goals. They: • Adapt and change according to social, political, economic, and organizational circumstances; • Harness the different functional areas and expertise of their members; and • Work toward the common goal of providing integrated services to PWDs, leading to more effective employment policies and services.

  36. Systems Change & the DEI Evaluation (continued) General Indicators (adapted from Coffman, 2007): • A social and political context that leads to resources, which • Provide for the design and implementation of effective strategies and • Create better linkages among stakeholders across the Workforce Development System, which lead to • Subsystems and processes that facilitate continuous improvement and monitoring, • So that the system can produce broad impacts on PWDs.

  37. Systems Change & the DEI Evaluation (continued) General Questions (adapted from Coffman, 2007) • Context: 1. Has the DEI changed the social and political context of LWIAs? 2. Has the DEI produced changes in the way resources are used to enhance policies or practices that will improve system components, capacity, and scale? • Strategies: 1. Did the DEI grantees implement selected strategies as intended? 2. Did these strategies produce their intended outcomes for PWDs? • Linkages: 1. Did the DEI create linkages across the system that did not exist prior to implementation and that improved access to needed services/resources? 2. Did the DEI enhance existing relationships that improved access to needed services/resources? 3. Did these linkages produce their intended outcomes on the system? • Capacity: 1. Did the DEI produce enhancements in the capacity of participating LWIAs to meet the needs of PWDs? 2. Did the enhancements meet sustainability and quality objectives? • Impacts: 1. Did the DEI lead to better access to and availability of services and resources for PWDs? 2. Did the DEI result in the intended impacts for PWDs at a system-wide population level?

  38. The SCCS Development Process • Our clients (ODEP and ETA) have extensive experience working on systems change procedures: • Customized Employment (ODEP) • Self-Employment for People with Disabilities (ODEP) • Disability Workforce Development System TA (ODEP) • Disability Program Navigator Initiative (ETA) • Rather than reinventing the wheel, Social Dynamics adapted ODEP’s existing systems change framework so that it aligned with the DEI.

  39. The ODEP Systems Change Framework ODEP’s 9 System Change Domains 1. Capacity 2. Coordination/Integration 3. Customer Choice 4. Employer Outreach 5. Employer Partnerships 6. Employer Support 7. Development or Adaptations of Existing Practices 8. Dissemination of Effective Practices 9. Sustainability

  40. Additional SCCS Domains • System Boundaries — defined by the members and the relationships that perpetuate, challenge, and/or enhance system operations (Midgley, 2007). Defines who the members of the system are and their respective roles. • System Density — the proportion of an entity’s (organization or individual) ties (or partnerships) to other members of the system out of the total number of ties across the system (Barabasi, 2002). An indicator of the amount of collaboration across the system.

  41. Additional SCCS Domains (continued) • System Centrality — the number of direct connections to each member of the system. Entities with high centrality are typically the leaders and facilitators of the system (Borgatti et al., 2009). Helps to identify the leaders of the system and their respective roles. Why are they the leaders? What do they do to enhance system effectiveness? Leaders may include organizations/agencies or individuals.
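
System density and centrality are standard network measures, so they can be computed directly from a list of partnership ties. The sketch below uses networkx; the system members and ties are hypothetical illustrations, not DEI data.

```python
import networkx as nx

# Hypothetical partnership ties among system members (names are illustrative).
ties = [
    ("LWIB", "American Job Center"), ("LWIB", "VR agency"),
    ("LWIB", "Employer council"), ("American Job Center", "VR agency"),
    ("American Job Center", "School district"),
]
system = nx.Graph(ties)

# System density: observed ties as a share of all possible ties in the system.
print(f"Density: {nx.density(system):.2f}")

# System centrality: members with the most direct connections are the likely
# leaders and facilitators of the system (Borgatti et al., 2009).
for member, degree in sorted(system.degree, key=lambda kv: -kv[1]):
    print(member, degree)
```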

  42. SCCS Structure • The SCCS is used to code qualitative data from interviews and focus groups with members of LWIA systems. Inter-rater reliability is obtained in two ways: • Team consensus is achieved through site visit team discussions. • Each team member completes the SCCS using the data from each site visit; a score is given by each team member using a rubric.

  43. SCCS Structure (continued)

  44. SCCS Structure (continued) 2. Weighted Kappa (linear weights for ordinal data). • Measures the degree of agreement between two site visitors who classify LWIAs’ “system” ratings into one of four ordinal groups at baseline and follow-up. • The weight w(i,j) quantifies the degree of concordance between ordinal groups: w(1,1) = 1.0, w(1,2) = .75, w(1,3) = .50, w(1,4) = .00.
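
Weighted kappa with linear weights is available in scikit-learn. This sketch uses hypothetical ratings from two site visitors across ten LWIAs; note that scikit-learn derives the linear weights from the ordinal distance between categories, so its weighting scheme may differ from the exact w(i,j) values listed above.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings: two site visitors classify ten LWIAs into one of
# four ordinal "system" groups (1-4) using the SCCS rubric.
visitor_a = [1, 2, 2, 3, 4, 1, 3, 2, 4, 3]
visitor_b = [1, 2, 3, 3, 4, 2, 3, 2, 3, 3]

# Linear weights penalize disagreement in proportion to its ordinal distance,
# so a 1-vs-2 disagreement costs less than a 1-vs-4 disagreement.
kappa = cohen_kappa_score(visitor_a, visitor_b, weights="linear")
print(f"Weighted kappa: {kappa:.2f}")
```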

  45. Social Dynamics, LLC, 481 North Frederick Ave., Suite 410, Gaithersburg, MD 20877 • Phone: (301) 990-1105 • www.socialdynamicsllc.com • To Contact the DEI Evaluation Team: Douglas Klayman, Ph.D., Project Director, dklayman@socialdynamicsllc.com, 301-990-1105 x 105 • Toll free: 1-855-990-1105
