Using Data to Drive AWD Program Improvement

Presentation Transcript


  1. Using Data to Drive AWD Program Improvement
     Sheila Thompson, AWD Coordinator, C-TEC of Licking County
     August 1, 2014

  2. Data and Compliance
     What data do we already collect for our funding and accreditation agencies?
     • Perkins/OBR/HEI
     • Accreditation (ACCSC, COE)
     • State Agencies/Boards (Public Safety, COS, Medical Board)
     • Financial Aid (Title IV, Pell, Loans, ODJFS, VA, WIA)

  3. Using the Data You Already Collect
     Which performance indicators are on target?
     • For Perkins/HEI/OBR?
     • For ACCSC/COE?
     • For state boards?
     Where are the data gaps?
     • By student, by program, and by CTPD
     What other information could inform improvement?
     • Advisory Committee Recommendations
     • Student satisfaction surveys
     • Stakeholder and client surveys
     • Instructors, Coordinators, FA and Student Services Staff

  4. Comparison of Performance Standards

  5. Perkins/HEI/OBR Data
     • CTPD-level data (aggregate)
     • Based on state/federal targets, with local targets “negotiated” each year
     • Technical Skill Attainment
     • Credential, Certificate or Degree
     • Student Retention or Transfer
     • Student Placement
     • Nontraditional Participation
     • Nontraditional Completion

  6. ACCSC and COE Data
     • Program-level data
     • Timelines based on program length and accreditation dates, NOT fiscal or calendar year
     • Retention and placement based on benchmarks, not targets
     • Enrollment
     • Retention/Completion
     • Credentialing
     • Placement
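
Because ACCSC and COE judge each program against fixed benchmarks rather than negotiated targets, a simple per-program check can flag where action is needed. The sketch below is illustrative only: the program names, rates, and benchmark values are hypothetical, and the real benchmarks and rate formulas come from your accreditor's reporting instructions.

```python
# Hypothetical sketch: flag programs whose reported rates fall below
# accreditation-style benchmarks. All names and numbers are made up.
BENCHMARKS = {"retention": 0.70, "placement": 0.70}  # assumed values

programs = {
    "Practical Nursing": {"retention": 0.82, "placement": 0.75},
    "Medical Assisting": {"retention": 0.64, "placement": 0.71},
    "Welding":           {"retention": 0.77, "placement": 0.66},
}

for name, rates in programs.items():
    for measure, benchmark in BENCHMARKS.items():
        if rates[measure] < benchmark:
            gap = benchmark - rates[measure]
            print(f"{name}: {measure} {rates[measure]:.0%} is "
                  f"{gap:.0%} below the {benchmark:.0%} benchmark")
```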

  7. CTPD vs. Program Data
     Program Data:
     • Why: Drives COE and ACCSC accreditation approvals
     • Ability to collect/use: Can be difficult to collect, verify, analyze, and improve over time
     • Relevance: The most important data you can collect, analyze, and act on
     CTPD Data:
     • Why: Drives Perkins/HEI/OBR performance and approvals
     • Ability to collect/use: Easy to find, if not always timely
     • Relevance: Difficult to analyze and act on beyond the CTPD level

  8. Data and Improvement
     To what extent does our required data drive improvement?
     • Enrollment
     • Retention/Completion
     • Credentialing
     • Placement

  9. Enrollment Data
     • Drives FTEs (AWD funding)
     • Drives Title IV and other FA resources
     • Becomes the starting point for your concentrators/completers
     • Used to evaluate program retention rates
     • Used in ED Gainful Employment Disclosures

  10. Perkins/OBR Enrollment Data

  11. HEI Enrollment Data

  12. Factors Impacting Enrollment
     • Marketing
     • Program/Center Reputation
     • Program Calendar, Length, or Design
     • Enrollment Process
     • Prerequisites
       • High School Diploma/GED
       • Work Keys
     • Financial Aid Approval and Program Cost

  13. Retention and Completion Data
     • FUNDING (Pell, Title IV, WIA)
     • Subject to ACCSC, COE accreditation benchmarks
     • Program level is more important than CTPD level
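
A program-level retention rate is just a numerator over a denominator, but the definitions matter: who counts as a start, and whether completers and still-enrolled students both count as retained. A minimal sketch, assuming a simplified rate of (completers + still enrolled) / starts; the actual cohort rules and status definitions come from the ACCSC/COE reporting instructions, and the records below are hypothetical.

```python
from collections import Counter

# Hypothetical student records: (student_id, status). Status labels assumed.
records = [
    (1, "completed"), (2, "completed"), (3, "enrolled"),
    (4, "withdrawn"), (5, "completed"), (6, "withdrawn"),
]

counts = Counter(status for _, status in records)
starts = len(records)

# Simplified rule: completers and still-enrolled students count as retained.
retained = counts["completed"] + counts["enrolled"]
print(f"Retention: {retained}/{starts} = {retained / starts:.0%}")
```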

  14. Perkins Retention Data

  15. CTPD Retention Data

  16. HEI Completion Data by Area

  17. Factors Impacting Retention and Completion
     • Satisfactory Academic Progress (SAP)
       • Attendance
       • Grades
     • Student Barriers (transportation, child care, conflicting work schedules, drug and alcohol use, mental illness)
     • Program/Instructor Issues
     • Leave of Absence (LOA)
     • Setting the bar too low for program enrollment

  18. Credentialing Data
     • Perkins/HEI/OBR performance target
     • Accreditation benchmark (COE, ACCSC)
     • Board/Agency benchmark (Public Safety, State Medical Board, Ohio Dept. of Health)
     • Required or highly recommended for employment
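
Credentialing rates are sensitive to the denominator: a program looks very different depending on whether students who never sat for the exam count against it. A hedged sketch computing the pass rate both ways; the records are hypothetical, and the test-taker exclusion shown here is an assumption, since each board or accreditor defines its own denominator.

```python
# Hypothetical completer records: did the student test, and did they pass?
completers = [
    {"tested": True,  "passed": True},
    {"tested": True,  "passed": False},
    {"tested": False, "passed": False},  # never sat for the exam
    {"tested": True,  "passed": True},
]

passed = sum(c["passed"] for c in completers)
tested = sum(c["tested"] for c in completers)

print(f"Pass rate, all completers:   {passed / len(completers):.0%}")
print(f"Pass rate, test-takers only: {passed / tested:.0%}")
```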

  19. Perkins Technical Skill Attainment Data

  20. HEI Technical Skill Attainment Data

  21. HEI Report: Work Keys (Applied Math, Locating Information)

  22. Perkins Credentialing Data

  23. HEI Credentialing Data

  24. HEI Credentialing Data

  25. COE Credentialing Chart

  26. Credentialing: ACCSC

  27. Factors Impacting Credentialing Rates
     • Misalignment between curriculum, instruction, and the credentialing exam
     • Students refusing to take the exam
     • Students unable to take the exam (criminal background, did not meet licensure requirements)
     • Students lacking required academic or technical skills
     • Inaccurate reporting
     • Access to exam results

  28. Placement Data
     • Perkins, accreditation, and FA requirement
     • Used in ED Gainful Employment Disclosures
     • For accreditation purposes, RELATED employment is more important than ANY OTHER placement category
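
Since accreditors weight RELATED employment above every other outcome, it is worth computing the related-placement rate separately from the overall placement rate. The category labels below are assumptions for illustration; the official categories and the rate formula come from the accreditor's reporting instructions.

```python
from collections import Counter

# Hypothetical placement outcomes for one program's completers.
outcomes = ["related", "related", "unrelated", "continuing_ed",
            "related", "unplaced", "related"]

counts = Counter(outcomes)
total = len(outcomes)

# Related placement is tracked on its own, not folded into "any placement".
print(f"Related placement: {counts['related'] / total:.0%}")
placed = total - counts["unplaced"]
print(f"Overall placement: {placed / total:.0%}")
```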

  29. Perkins Placement Data

  30. COE Placement Chart

  31. Factors Impacting Placement Rates
     Student Factors:
     • Did not pass required credentials
     • Employability skills (work ethic, personal barriers)
     • Lack of interest in the related field
     School Factors:
     • Inconsistent or incomplete reporting
     • Lack of emphasis on follow-up and placement services
     Labor Market Factors:
     • Industry demand/current openings in the related field

  32. Using the Data You Already Collect
     Which performance indicators are on target?
     • For Perkins/HEI/OBR?
     • For ACCSC/COE?
     • For state boards?
     Where are the data gaps?
     • By student, by program, and by CTPD
     What other information could inform improvement?
     • Advisory Committee Recommendations
     • Student satisfaction surveys
     • Stakeholder and client surveys
     • Instructors, Coordinators, FA and Student Services Staff

  33. Implications for AWD Culture
     • More accountability by program and instructor
     • Better systems for collecting, analyzing, and acting on data
     • More frequent and deeper communication within and among program staff
     • Increased collaboration among intake, testing, instructional, financial aid, student services, and administrative staff
     • Action plans (Institutional Assessment and Improvement, CIP, CCIP) that are actually used and understood

  34. Rewards for Using Data to Improve
     • More collaboration among staff
     • Fewer “findings” or “opportunities for improvement”
     • Improved programming and services for students
     • Improved relationships with external stakeholders
     • Move from “good” to “great”!
