
Part II



Presentation Transcript


  1. Part II: The Details

  2. MIS Data • Data Elements • Funding • Accountability • Core Indicator Data

  3. Defining the Data • SAM Codes • TOP Codes • Data Elements • Core Indicators • “The Law” • Definitions • Negotiated Performance Targets • Measurement Approaches/Formulas • Funding

  4. Student Accountability Model (SAM) & Taxonomy of Programs (TOP) • Priority “A“ – Apprenticeship • Must have the approval of the Division of Apprenticeship Standards • Priority “B“ – Advanced Vocational • Used sparingly, no more than two courses in any one program • “B” level courses must have a “C” prerequisite in the same program area • Priority "C" – Clearly Occupational • Generally taken in the middle stage of a program; deters "drop-ins."

  5. Student Accountability Model (SAM) & Taxonomy of Programs (TOP), Continued • Priority "D" – Possibly Occupational • Taken by students in the beginning stages of their occupational programs • Can be a survey course • Priority “E” – Non-Occupational • Vocational Flag on TOP code* – Designed to identify vocational “Programs” for federal reporting • * See Taxonomy of Programs, Sixth Edition, July 2007

  6. Data Elements: MIS System • Students, Courses, Degrees, Services • Student VTEA Data Elements • Economically Disadvantaged • Single Parent • Displaced Homemaker • Cooperative Work Experience Education • Tech Prep • Migrant Worker – implementation in MIS SU 09

  7. Accountability Requirements, Section 113(b) • 5 core indicators of performance: • Student attainment of technical skill proficiencies; • Student attainment of credential, certificate, or degree; • Student retention in postsecondary education or transfer; • Student placement in military, apprenticeship, or employment; • Student participation in/completion of non-traditional training • State and local adjusted levels of performance • Levels of performance negotiated with USDE / State • Results reported annually

  8. Perkins IV (2006) Core Indicators • Technical Skill Attainment – Successful CTE course completion (GPA) • Completions – Program completion: Certificate, Degree & Transfer Prepared • Persistence & Transfer – Student persistence in Higher Ed • Placement – Placement in apprenticeship, employment, military, fed gov • Equity, Nontraditional Fields – Participation (5a) / Completion (5b) in nontraditional “fields”

  9. Cohort Definitions Used for Measurement • Participant: NT Participation • Perkins III – Any enrollment in a CTE course (TOP) • Perkins IV – Concentrator using assigned major • Concentrator: All other indicators • Cohort of students enrolled during the cohort year, and • Successfully completed at least one course in the middle or end of a program (SAM A-C) and 12 vocational units within a single discipline (two-digit TOP), or • Program completion as indicated by receipt of ANY vocational credit certificate or degree (sketch below) • Leavers: Not enrolled in year following cohort year • 2P1 – Completions • 4P1 – Placement (Employment)
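
The concentrator rule above reduces to a small classification check. A minimal sketch in Python, using hypothetical field names (sam_code, top2, units, passed) rather than actual MIS data element identifiers:

```python
# Minimal sketch of the concentrator rule: a vocational award, OR at least one
# successfully completed SAM A-C course plus 12+ vocational units within a
# single two-digit TOP discipline. Field names are hypothetical.

def is_concentrator(courses, earned_voc_award):
    """courses: list of dicts with 'sam_code', 'top2', 'units', 'passed'."""
    if earned_voc_award:                 # any vocational certificate or degree
        return True
    passed_abc = any(c["passed"] and c["sam_code"] in ("A", "B", "C")
                     for c in courses)
    units_by_top2 = {}                   # vocational units per two-digit TOP
    for c in courses:
        units_by_top2[c["top2"]] = units_by_top2.get(c["top2"], 0) + c["units"]
    return passed_abc and any(u >= 12 for u in units_by_top2.values())
```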

  10. Assigning a Program to a Student • Award – TOP code of CTE Certificate or Degree • Concentrators • Hierarchy based on SAM Priority code • Assigned to the TOP where most CTE units occurred (sketch below)
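
A sketch of that assignment order, assuming the award check comes first and the SAM-priority hierarchy is simplified down to “TOP code with the most CTE units” (field names are hypothetical):

```python
# Sketch of the program-assignment hierarchy: an award's TOP code wins;
# otherwise the student is assigned to the TOP code with the most CTE units.
# The SAM-priority tie-breaking detail is simplified away here.

def assign_program(award_top, cte_courses):
    """award_top: TOP code of a CTE certificate/degree, or None.
    cte_courses: list of dicts with 'top' and 'units'."""
    if award_top is not None:
        return award_top
    units_by_top = {}
    for c in cte_courses:
        units_by_top[c["top"]] = units_by_top.get(c["top"], 0) + c["units"]
    return max(units_by_top, key=units_by_top.get) if units_by_top else None
```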

  11. Timeline for Outcomes & Outputs • 2008-09 performance levels negotiated Spring 2008 • Reports publish in Spring 2009 • Cohort year (2006-07) + 1 yr for outcomes (2007-08): Transfer, Persistence, Employment • Outcomes have already occurred – target low performance now!

  12. Timeline for Outcomes & Outputs

  13. Core Indicator 1: Technical Skill Attainment • All Concentrators • Successful Course Completions • Technical Skill Proficiencies • Vocational (CTE) Courses • SAM A-C • Vocational TOP • G.P.A. • Grade reports (moved to Data Mart)

  14. Core Indicator 1: Measurement & Performance Levels • SAM A-C Courses: # Student concentrators with GPA > 2.00 ÷ # Student concentrators with grades A–F • Excludes students with only CR/NC or P/NP grades in SAM A-C courses

  Plan Year   Target     Actual*
  2005-06                93.65%
  2006-07                92.58%
  2007-08                92.46%
  2008-09     92.46%     92.xx%
  2009-10     92.yy%

  * Based on spring 2008 Perkins IV reports.
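
As a worked illustration of that ratio, here is a minimal Python sketch; the grade records and field layout are hypothetical, not the actual MIS referential files, and GPA is unweighted by units for simplicity:

```python
# Minimal sketch of the CI 1 ratio: concentrators with GPA above 2.00 in
# SAM A-C courses over concentrators with letter grades in those courses,
# excluding students whose only SAM A-C grades are CR/NC or P/NP.

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def ci1_rate(concentrator_grades):
    """concentrator_grades: dict of student id -> list of (sam_code, grade)."""
    numerator = denominator = 0
    for grades in concentrator_grades.values():
        letters = [g for sam, g in grades
                   if sam in ("A", "B", "C") and g in GRADE_POINTS]
        if not letters:            # only CR/NC or P/NP grades: excluded
            continue
        denominator += 1
        gpa = sum(GRADE_POINTS[g] for g in letters) / len(letters)
        if gpa > 2.0:              # threshold as stated on the slide
            numerator += 1
    return numerator / denominator if denominator else None
```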

  15. CI 1 - 2008 Data

  16. CI 1 - 2008 Data - Student counts

  17. CI 1 - 2008 Data

  18. Fake Data (Exaggerated): 93.67%

  19. Fake Data (Exaggerated): 92.5%, 84.3%, 83.2%

  20. Forecasting • Wide range of forecasting techniques • Qualitative Forecasting Methods (informal): Judgmental Forecasting, Expert Forecasting, Consensus Forecasting • Quantitative Forecasting Methods: Time Series – Naïve Forecasting, Averaging • Causal / Relational Models (sophisticated): assume cause and effect, and that cause can be used to predict outcomes – if you know one variable, you can forecast the other

  21. Qualitative Forecasting Methods • Judgmental Forecasting, Expert Forecasting, Consensus Forecasting (informal) • Work best when background conditions are changing rapidly: • When economic, political, or administrative conditions are in flux • When quantitative methods may not capture important information about factors that are likely to alter historical patterns (e.g., a new large apprenticeship program)

  22. Qualitative Forecasting Weaknesses • Anchoring events – allowing recent events to influence perceptions about future events, e.g., the college hosting a recent institute on student learning • Information availability – over-weighting readily available information • False correlation – incorporating information about factors that are assumed to influence outcomes, but do not • Inconsistency in methods and judgments – forecasters using different strategies over time to make their judgments, making them less reliable • Selective perceptions – ignoring important information that conflicts with the forecaster’s view about causal relationships • Wishful thinking – giving undue weight to what forecasters and government officials would like to see happen • Group think – when the dynamics of forming a consensus lead individuals to reinforce each other’s views rather than maintaining independent judgments • Political pressure – where forecasters adjust estimates to meet the imperatives of budgetary or other college constraints

  23. Simple Quantitative: Naïve Forecasting • Random Walk (last known) • Random Walk with drift • Averages • Seasonal adjustments • Used in expert forecasting as the starting point for estimates that are then adjusted mentally (sketch below)
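
A minimal sketch of the two random-walk forecasts named above, applied to a short annual rate series; the series here is illustrative (the three actuals from the CI 1 table), not the full data behind the charts:

```python
# Naive one-step-ahead forecasts for an annual rate series.

def random_walk(series):
    """Random walk: the forecast is simply the last known value."""
    return series[-1]

def random_walk_with_drift(series):
    """Random walk with drift: last value plus the average per-period change."""
    drift = (series[-1] - series[0]) / (len(series) - 1)
    return series[-1] + drift

rates = [93.65, 92.58, 92.46]        # illustrative three-year history
# random_walk(rates)            -> 92.46
# random_walk_with_drift(rates) -> 92.46 - 0.595 = 91.865 (decline continues)
```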

  24. Random Walk Last known

  25. Last known Random Walk

  26. Random Walk with drift

  27. Random Walk with drift

  28. Averages: CI 1 – 2008 Data • Random walk: 92.34% • Random walk with drift: 92.46%

  29. Moving Average • In a moving average, only the last N periods of data are used, weighted equally; all prior observations are dropped • Provided in the workbook
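
A small sketch of an N-period moving-average forecast; N and the series are illustrative, and the workbook’s own implementation is not reproduced here:

```python
# N-period moving average forecast: only the last N observations are used,
# each weighted equally.

def moving_average_forecast(series, n=3):
    window = series[-n:]               # the last N periods only
    return sum(window) / len(window)

rates = [93.65, 92.58, 92.46]          # illustrative history
# moving_average_forecast(rates, n=3) -> (93.65 + 92.58 + 92.46) / 3 ≈ 92.90
```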

  30. Averages: CI 1 - 2008 Data 92.6%

  31. What we used on CI 1 - 2008 Data 92.46%

  32. Three Basic “Chuck” Rules • With no application of local knowledge or sophisticated projections: • Declining for three or more years – random walk, last known • Increasing for three or more years – three-year average • Increasing and decreasing – three-year average (sketch below)
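
Those three rules collapse to a short decision function. A sketch under the assumption that “declining for three or more years” means the last three year-over-year changes are all negative; anything else falls through to the three-year average:

```python
# The three "Chuck" rules as a decision function. "Declining for three or
# more years" is interpreted here as the last three year-over-year changes
# all being negative; increasing or mixed series get the three-year average.

def chuck_rule_target(series):
    last_four = series[-4:]
    changes = [b - a for a, b in zip(last_four, last_four[1:])]
    if len(changes) >= 3 and all(c < 0 for c in changes):
        return series[-1]                # declining: random walk, last known
    return sum(series[-3:]) / 3          # otherwise: three-year average

# chuck_rule_target([93.65, 92.58, 92.46, 92.30]) -> 92.30 (declining series)
```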

  33. Core Indicator 2: Program Completions • Completers (numerator) • Transfer Prepared • Award in Current Year • AA/AS degrees • Certificates • Award in subsequent year with no Voc coursework • or Equivalent • Leavers & Completers (denominator) • Left system (college) for one year and/or • Award in Current Year • AA/AS degrees • Certificates • Transfer Prepared • Award in subsequent year with no Voc coursework • Removed: Persisters & Life-Long Learners

  34. CI 2 – Completions: Measurement & Performance Levels • Certificate/Degree/Transfer Prepared ÷ Concentrators (Leavers & Completers), not LLL • * Based on Perkins IV data.
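
A minimal sketch of that ratio over a concentrator cohort, with hypothetical boolean flags standing in for the actual completer, leaver, and life-long-learner determinations:

```python
# Sketch of the CI 2 ratio: completers over leavers-and-completers, with
# persisters and life-long learners removed from the denominator.
# The boolean flags are hypothetical stand-ins for the cohort logic.

def ci2_rate(concentrators):
    """concentrators: list of dicts with 'completer', 'leaver', 'lifelong_learner'."""
    denom = [s for s in concentrators
             if (s["completer"] or s["leaver"]) and not s["lifelong_learner"]]
    numer = [s for s in denom if s["completer"]]
    return len(numer) / len(denom) if denom else None
```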

  35. CI 2 2008 data 66.13%

  36. CI 2 2008 data, revised (3-yr average)

  37. Core Indicator 3: Persistence & Transfer • Concentrators who were not leavers in the year following the cohort year, or transfers to CCC/CSU/UC/Privates (National Student Loan Clearinghouse) ÷ All Concentrators who were not completers with degrees or certificates (unless transferring) • New
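
A sketch of the CI 3 ratio, again with hypothetical flags in place of the actual enrollment and Clearinghouse transfer matches:

```python
# Sketch of the CI 3 ratio: persisters (non-leavers) or transfers, over all
# concentrators excluding degree/certificate completers who did not transfer.
# Flags are hypothetical stand-ins for the enrollment and transfer matches.

def ci3_rate(concentrators):
    """concentrators: list of dicts with 'leaver', 'transferred', 'completer'."""
    denom = [s for s in concentrators
             if not s["completer"] or s["transferred"]]
    numer = [s for s in denom if not s["leaver"] or s["transferred"]]
    return len(numer) / len(denom) if denom else None
```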

  38. CI 3 – 2008 data New

  39. CI 3 – 2008 data 92.95% 82.95% New

  40. Core Indicator 4: Placement • Placement • Leavers and Completers • Minus continuing in two- or four-year institutions • CCC or National Student Loan Clearinghouse • Employment 1st year out • UI wage file match • Employment in any quarter of the academic year after the cohort year • Apprenticeship, Military, Fed Gov

  41. CI 4 – Placement: Measurement & Performance Levels • Leavers & Completers in UI-covered employment or Apprenticeship, Military, Fed Gov ÷ All Leavers & Completers • * Based on Spring 2008 Perkins IV data.
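
A minimal sketch of the CI 4 ratio, with hypothetical flags in place of the actual UI wage-file and other placement matches (continuing students are assumed to have been removed already):

```python
# Sketch of the CI 4 ratio: leavers & completers found in UI-covered
# employment, apprenticeship, the military, or federal government, over all
# leavers & completers. Flags are hypothetical stand-ins for the matches.

def ci4_rate(leavers_and_completers):
    """Each record: dict with 'ui_employed', 'apprentice', 'military', 'fed_gov'."""
    placed = [s for s in leavers_and_completers
              if s["ui_employed"] or s["apprentice"] or s["military"] or s["fed_gov"]]
    total = len(leavers_and_completers)
    return len(placed) / total if total else None
```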

  42. CI 4 2008 data 79.86% 71.87%

  43. CI 4 2008 data revised 72.82% 80.91%

  44. Core Indicator 5: Gender Equity • Programs leading to Nontraditional Fields (e.g., Men in Nursing, Women in Auto) • 75% / 25% threshold from 2000 Census employment data • NAPE-developed Nontraditional CIP table • Job codes (SOC) mapped to 2000 Census data • SOC codes mapped to CIP (USDE) • CIP codes mapped to TOP (CCC)

  45. Core Indicator 5: Gender Equity • Programs leading to Nontraditional Fields • Nontraditional-Gender Students ÷ All Students in NT Program
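
A sketch of that ratio; the gender values, TOP codes, and the nontraditional lookup table below are illustrative placeholders for the NAPE-derived SOC → CIP → TOP crosswalk:

```python
# Sketch of the CI 5 ratio: students whose gender is nontraditional for the
# field, over all students enrolled in that nontraditional (NT) program.
# The lookup table is an illustrative stand-in for the NAPE crosswalk.

NONTRAD_GENDER_BY_TOP = {
    "1230.10": "M",   # e.g., men in Nursing
    "0948.00": "F",   # e.g., women in Automotive
}

def ci5_rate(students):
    """students: list of dicts with 'top' (program TOP code) and 'gender'."""
    in_nt = [s for s in students if s["top"] in NONTRAD_GENDER_BY_TOP]
    nt_gender = [s for s in in_nt
                 if s["gender"] == NONTRAD_GENDER_BY_TOP[s["top"]]]
    return len(nt_gender) / len(in_nt) if in_nt else None
```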

  46. CI 5a – Participation: Measurement & Performance Levels • Nontraditional participants enrolled in a Nontraditional TOP Code ÷ All participants enrolled in a Nontraditional TOP Code • * Based on Spring 2008 Perkins IV data.

  47. CI 5a 2008 data 19.32% 21.47%

  48. CI 5a 2008 data revised 21.63% 19.47%

  49. CI 5b – Completion: Measurement & Performance Levels • Nontraditional “completers” of nontraditional programs ÷ All “completers” of nontraditional programs • * Based on Spring 2008 Perkins IV data.

  50. CI 5b – Spring 2008 data 23.28% 20.95%
