
AY 2012 CTE Instructional Program Review Process






Presentation Transcript


  1. AY 2012 CTE Instructional Program Review Process Data Description Process – Timeline Rev. 10-15-12

  2. Purpose • The purpose of this presentation is to describe the process we follow for our local Comprehensive Program Review and for the system-required Annual Program Reviews, and to explain how the data are derived. • We have been asked to produce an annual program review for every one of our instructional programs and units. They are required of each community college in the system and will be taken to the U of H Board of Regents for their review. • If you are normally scheduled to do a comprehensive review or are “Jumping”, you will need to complete a comprehensive review this year. Additionally, every instructional and non-instructional program will do an Annual Program Review this year. • Not sure if you’re scheduled for a comprehensive review this year? Find out by clicking here for the Comprehensive Program-Unit Review Cycle and Schedule

  3. Reason for Program Review • Program Review is Assessment • The review of a program should be an ongoing, year-round, reflective process. • Program review processes assure educational quality and help programs evaluate and improve their services. • Program review is an opportunity for self-study, self-renewal, and identifying needed improvements. • A robust program review process is one mechanism available to the college to improve student success.

  4. What are we doing to improve our program review process? Upon conclusion of every program/unit review cycle, the IR Office takes extra care to ensure that we are improving our program/unit review process on campus. This is accomplished by sending out questionnaires specific to each group and by meeting with the various groups across campus to collect their feedback. Your suggestions for improving this process are then published to the Program Review website and are linked here for your convenience. AY 2011 Program/Unit Process Improvement Summary Based upon the feedback we received from everyone last year through our program-unit review process improvement focus groups, the following changes have been made and incorporated into the planning of this year’s review: • In order to assure the best possible attendance for our annual training, the VC for Academic Affairs will notify the Chancellor, VCs, and Directors when training is ready, and the training will be scheduled through the VCAA’s office and secretaries. It will be the responsibility of the VCs and Directors to disseminate the day, time, and location of training to Writers and Initiators. • In order to improve navigation on the program review website, our web developer has reorganized the current site and will be responsible for publishing documentation going forward. • To provide the best possible support for training on campus, we will continue to offer 3 separate training sessions this year: one for Units, one for Liberal Arts, and one for CTE programs.

  5. What else are we doing to improve our program review process? • Writers reported some performance issues with the online tool last year. Joni has taken your concerns to the appropriate party at the system office and has asked them to work some of the issues out. • You said that the training you received last year regarding changes to the Instructional Program Review Comprehensive Template was helpful. We will continue to provide a brief overview of the templates upon conclusion of our normally scheduled training. • All annual and comprehensive templates have been updated since last year based on your feedback and on an evaluation of the comprehensive review templates by CERC. • A step was added to our Comprehensive Program/Unit Review Process document to ensure timely delivery of the budgetary data needed to complete an instructional program review. • A step was added to ensure that VCs and Directors take more ownership of our local program/unit review process by providing training specific to their groups (e.g., the Vice Chancellor for Academic Affairs would provide training to the Academic Support Group). • A step was added to ensure that all of the suggested improvements we commit to every year through our Program/Unit Process Improvement efforts are completed by the responsible party prior to leaving for the summer. This is the improved process we are using this year: Comprehensive Program-Unit Review Process

  6. What is different this year? • Program Student Learning Outcomes are now required for all CTE Programs in the community college system. • Under the online Web Submission “External” tab, there is a new requirement to enter the number sitting for and the number passing External Licensing Exams. • Under the online Web Submission “Cost per SSH” tab, there is a new requirement to enter Tuition and Fees, which are now part of the Overall Program Budget Allocation for all instructional programs. Joni will be uploading all of the budget information you need this year. • Under the online Web Submission “Description” tab, there is a new requirement to enter the web address of your last comprehensive review, along with the date it was completed. Please link your last comprehensive review here, not this year’s.

  7. What else is different this year? • Academic Subject Certificates were included last year as part of your count of certificates awarded. This year they are grouped under “other certificates awarded.” • A different certificate is included in your data this year as part of the count of certificates awarded: the Advanced Professional Certificate. • Annual Reports of Program Data are completely on-line this year for Academic Support. • The number of new and replacement positions for your programs this year will be determined by the CIP code assigned by IRAO. • Data elements with asterisks next to them are now used for health calls. • In the online Web Submission section there is a new “P-SLO” tab, which requires the following new information: Evidence of Industry Validation (for CTE programs), Expected Level of Achievement, Courses Assessed, Assessment Strategy/Instrument, Results of Program Assessment, Other Comments, and Next Steps (see the following slide for definitions).

  8. What to enter under the P-SLO tab • Evidence of Industry Validation Provide documentation that the program has submitted evidence and achieved certification or accreditation from an organization granting certification in an industry or profession.  If the program/degree/certificate does not have a certifying body, the recommendations for, approval of, and/or participation in, assessment by the program’s advisory committee/board can be submitted. • Expected Level of Achievement Describe the different levels of achievement for each characteristic of the learning outcome(s) that were assessed.  What represented “excellent,” “good,” “fair,” or “poor” performance using a defined rubric and what percentages were set as goals for student success (for example: “85% of students will achieve good or excellent in the assessed activity.”) • Courses Assessed List the courses assessed during the reporting period. • Assessment Strategy/Instrument Describe what, why, where, when, and from whom assessment artifacts were collected. • Results of Program Assessment The % of students who met the outcome(s) and at what level they met the outcome(s). • Other Comments Include any information that will clarify the assessment process report. • Next Steps Describe what the program will do to improve the results.  "Next Steps" can include revision to syllabi, curriculum, teaching methods, student support, and other options.

  9. Jumper Defined • A Jumper is a locally defined term for an instructional program or non-instructional program (unit) that has decided to jump out of its normally scheduled slot for its comprehensive review and into this year’s cycle. • Jumping into this year’s comprehensive cycle means that you will have an opportunity to be considered for any budgetary decisions made in this year’s budget process. • Jumpers will still have to do their comprehensive review at their next scheduled review. Jumping does not affect the existing schedule; you are voluntarily doing an extra review to be considered in this budget cycle.

  10. I belong to an Instructional Program… which template do I use? COMPREHENSIVE REVIEWS Comprehensive Instruction Program Review Template (Use this template ONLY if you are scheduled for a comprehensive program review this year or are jumping) ------------------------------------------------------------------------------------------------------------------- ANNUAL REVIEWS Your program’s data table is available on-line using the link below. You should have everything you need to begin writing your review within the web submission tool. Plan to save your work often, especially when switching between screens, and plan to do most of your formatting within the tool if you are copying and pasting in from Word. UHCC Annual Report of Program Data Web Submission Tool (ALL Instructional programs will need to complete this, even if you’re completing a comprehensive review this year)

  11. Terminology / Timing • The Census freeze event is the fifth Friday after the first day of instruction. • The End of semester freeze event is 10 weeks after the last day of instruction. • FISCAL_YR_IRO: Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004-2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004, Fall 2004, and Spring 2005 semesters.
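As a minimal illustration of the FISCAL_YR_IRO convention (the helper function below is ours, not part of any UHCC system):

```python
from datetime import date

def fiscal_yr_iro_range(fiscal_yr_iro: int) -> tuple:
    """Return the (start, end) dates of the fiscal year named by a
    FISCAL_YR_IRO value, which indicates the *ending* calendar year."""
    return date(fiscal_yr_iro - 1, 7, 1), date(fiscal_yr_iro, 6, 30)

# FISCAL_YR_IRO 2005 -> July 1, 2004 through June 30, 2005,
# covering the Summer 2004, Fall 2004, and Spring 2005 semesters.
print(fiscal_yr_iro_range(2005))
```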

  12. Instructional Program Review Data Elements • Student information data this year comes exclusively from the Operational Data Store (ODS). Organizationally this means that all community colleges are getting the data from the same place and at the same snapshot in time (this is a good thing). • The following slides will explain in detail what data has been provided to you for your comprehensive/annual instructional program review write-ups and how it has been calculated.

  13. #1 New and Replacement Positions (State) • Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the state level, compiled from Standard Occupational Classification (SOC) codes aligned to the program’s Classification of Instructional Programs (CIP) codes. • Data based on annual new/replacement position projections as of Spring 2012. • State position numbers are not pro‐rated. • Note: In the past, programs could select SOC codes, and some programs had more than one. This year the official CIP code assigned by IRAO is used. Click on the link below for a list of the CIP codes that were used to pull the labor data. The SOC codes associated with each CIP code are actually where the data come from. AY 2012 HawCC CIP Code Listing

  14. #2 New and Replacement Positions (County prorated) • Economic Modeling Specialists Inc. (EMSI) annual new and replacement jobs at the county level, compiled from Standard Occupational Classification (SOC) codes aligned to the program’s Classification of Instructional Programs (CIP) codes. • Note: It is possible for the number of new and replacement positions in the county to be higher than the state if the projection in other counties is for a loss of new and replacement positions. • County data are pro‐rated to reflect the number of programs aligned to the SOC code and weighted by the number of majors in each program/institution for programs that share SOC codes. • Data based on annual new/replacement position projections as of Spring 2012.

  15. #3 Number of Majors • Count of program majors whose home institution is your college. Count excludes students who have completely withdrawn from the semester at Census. • This is an annual number. Programs receive a count of .5 for each term (fall and spring) within the academic year that the student is a major, up to a maximum count of 1.0 (one) for each student.
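A minimal sketch of this counting rule, assuming hypothetical sets of student IDs for the majors recorded at each census (names and data are ours):

```python
def annual_major_count(fall_majors: set, spring_majors: set) -> float:
    """Measure #3: 0.5 per term a student is a major, capped at 1.0 per student."""
    total = 0.0
    for student in fall_majors | spring_majors:
        total += 0.5 * ((student in fall_majors) + (student in spring_majors))
    return total

# s1 is a major both terms (1.0); s2 and s3 are majors one term each (0.5 + 0.5).
print(annual_major_count({"s1", "s2"}, {"s1", "s3"}))  # 2.0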

  16. #4 SSH Program majors in Program Classes • The sum of Fall and Spring SSH taken by program majors in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, Summer SSH are included. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program. • Not sure what your program classes are? Click here to find out: Courses Taught Aligned to Instructional Programs

  17. #5 SSH Non-Majors in Program Classes • The sum of Fall and Spring SSH taken by non‐program majors (not counted in #4) in courses linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, Summer SSH are included. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  18. #6 SSH in All Program Classes • The sum of Fall and Spring SSH taken by all students in classes linked to the program. Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, Summer SSH are included. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  19. #7 FTE Enrollment in Program Classes • Sum of Student Semester Hours (SSH) taken by all students in classes linked to the program (#6) divided by 30. Undergraduate, lower division Full Time Equivalent (FTE) is calculated as 15 credits per term. • Captured at Census and excludes students who have already withdrawn (W) at this point. • Note: for programs where year‐round attendance is mandatory, summer SSH are included.
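A worked example of the calculation, with a hypothetical SSH total:

```python
# Measure #7: FTE enrollment. 15 credits per term x 2 terms = 30 SSH
# equals one full-time lower-division student per year.
ssh_all_program_classes = 1260          # hypothetical value of measure #6
fte_enrollment = ssh_all_program_classes / 30
print(fte_enrollment)                   # 42.0 FTE
```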

  20. #8 Total Number of Classes Taught • Total number of classes taught in Fall and Spring that are linked to the program. Includes Summer classes if year‐round attendance is mandatory. • Concurrent and cross-listed classes are only counted once, for the primary class. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  21. CTE Program Scoring Rubric Definitions Your program health is determined by 3 separate types of measures: Demand, Efficiency, and Effectiveness. This slide explains why these measures were chosen to determine program health. • Demand: A seeking or state of being sought after, i.e. your program’s ability to attract new students every year based on your offering. • Efficiency: Acting or producing effectively with a minimum of waste, expense, or unnecessary effort, i.e. your program’s ability to use its resources in the best possible way. • Effectiveness: Stresses the actual production of, or the power to produce, an effect, i.e. your program’s ability to produce the desired result.

  22. Determination of program’s health based on demand • This year the system office will calculate and report health calls for all instructional programs using academic year 2012 data. The following instructions illustrate how those calls are made. • Program Demand is determined by taking the number of majors (#3) and dividing it by the number of New and Replacement Positions by County (#2). • The following benchmarks are used to determine demand health: Healthy: 1.5 - 4.0 Cautionary: .5 - 1.49; 4.1 - 5.0 Unhealthy: < .5; > 5.0 • Finally, an Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy
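A minimal sketch of the demand call using the cut points above (function and variable names are ours; how values falling exactly between the published bands are treated is our assumption):

```python
def demand_health_score(majors: float, county_positions: float) -> int:
    """Return the Overall Category Health Score: 2, 1, or 0."""
    demand = majors / county_positions      # measure #3 / measure #2
    if demand < 0.5 or demand > 5.0:
        return 0                            # Unhealthy
    if 1.5 <= demand <= 4.0:
        return 2                            # Healthy
    return 1                                # Cautionary: .5-1.49 or 4.1-5.0

print(demand_health_score(majors=60, county_positions=30))  # 2 (demand = 2.0)
```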

  23. #9 Average Class Size • Total number of students actively registered in Fall and Spring program classes divided by classes taught (#8). Does not include students who have already withdrawn from the class by Census. • Excludes Directed Studies (99 series). Differs from MAPS as UHCC data includes Cooperative Education (93 series) as there is a resource cost to the program.

  24. #10 Fill Rate • Total active student registrations in program classes (number of seats filled) at Fall and Spring census divided by the maximum enrollment (number of seats offered). • Captured at Census and excludes students who have already withdrawn (W) at this point.
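To make #9 and #10 concrete, a small sketch with hypothetical census numbers:

```python
registrations  = 240    # active Fall + Spring registrations in program classes
classes_taught = 12     # measure #8
seats_offered  = 300    # sum of maximum enrollments across those classes

average_class_size = registrations / classes_taught   # measure #9 -> 20.0
fill_rate = registrations / seats_offered             # measure #10 -> 0.80 (80%)
```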

  25. #11 FTE BOR Appointed Faculty • Sum of the appointments (1.0, 0.5, etc.) of all BOR appointed program faculty (excludes lecturers and other non-BOR appointees). • Uses the “hiring status” of the faculty member – not the teaching/work load. • Uses the Employing Agency Code (EAC) recorded in the Human Resources (HR) database to determine a faculty member’s program home. • Data as of 10/31/2011. • Data provided by UH Human Resources Office. • Click here for the count of BOR Appointed Program Faculty in your program: 2012 BOR Appointed Program Faculty

  26. #12 Majors to FTE BOR Appointed Faculty • Number of majors (#3) divided by the sum of the appointments (#11) (1.0, 0.5, etc.) of all BOR appointed program faculty. • Data show the number of student majors in the program for each faculty member (25 majors to 1 faculty shown as “25”).

  27. #13 Majors to Analytic FTE Faculty • Number of majors (#3) divided by number of Analytic FTE faculty (13a).

  28. #13a Analytical FTE Faculty (Workload) • Calculated as the sum of Semester Hours (not Student Semester Hours) taught in program classes divided by 27. • Analytic FTE is useful as a comparison to the FTE of BOR appointed faculty (#11). Used for analysis of program offerings covered by lecturers.
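A short sketch contrasting the two faculty measures, with hypothetical numbers:

```python
semester_hours_taught = 54                  # semester hours, not SSH
analytic_fte = semester_hours_taught / 27   # measure #13a -> 2.0

majors = 50                                 # measure #3
majors_per_analytic_fte = majors / analytic_fte   # measure #13 -> 25.0

# If FTE of BOR appointed faculty (#11) were 1.5, the gap between 2.0 analytic
# FTE and 1.5 appointed FTE would suggest lecturer-covered workload.
```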

  29. #14 Overall Program Budget Allocation • The overall program budget allocation = General Funded Budget Allocation (14a) + Special/Federal Budget Allocation (14b) + Tuition and Fees (14c) + Other funds • The overall program budget allocation is automatically calculated when you enter your general funded budget allocation, special/federal budget allocation, other funds, or tuition and fees into the online tool tab called “Cost per SSH.” Again, Joni will upload the data for you this year. • The overall program budget allocation is to be determined by the College and it includes: salaries (general funds, special funds, etc.), overload, lecturers, costs for all faculty and staff assigned to the program, supply and maintenance, and tuition and fees.

  30. #14a General Funded Budget Allocation • The general funded budget allocation = actual personnel costs + B-budget expenditures

  31. #14b Special/Federal Budget Allocation • The expenditure of dollars from Federal grants

  32. #14c Tuition and Fees • New this year: the amount collected for tuition and fees in the 2012 academic year is included in your program’s overall budget allocation.

  33. #15 Cost per SSH • Overall Program Budget Allocation (#14) divided by SSH in all program classes (#6) • This value is automatically calculated for you when you enter your general funded budget allocation, special/federal budget allocation, other funds, or tuition and fees, into the online tool tab called, “Cost per SSH.”
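A sketch of how #14 and #15 combine, with hypothetical dollar amounts:

```python
general_funded   = 250_000   # 14a: actual personnel costs + B-budget expenditures
special_federal  =  40_000   # 14b: expenditures from federal grants
tuition_and_fees = 110_000   # 14c: new this year
other_funds      =       0

# Measure #14: Overall Program Budget Allocation
overall_allocation = general_funded + special_federal + tuition_and_fees + other_funds

# Measure #15: Cost per SSH
ssh_all_program_classes = 1260               # measure #6
cost_per_ssh = overall_allocation / ssh_all_program_classes
print(round(cost_per_ssh, 2))                # 317.46
```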

  34. #16 Number of Low Enrolled (<10) Classes • Classes taught (#8) with 9 or fewer active students at Census. • Excludes students who have already withdrawn (W) at this point. • Excludes Directed Studies (99 series). • Includes Cooperative Education (93 series) as there is a resource cost to the program.

  35. Determination of program’s health based on efficiency • This year the system office will calculate and report health calls for all instructional programs using AY 2012 data. The following instructions illustrate how those calls are made. • Program Efficiency is calculated using 2 separate measures: Fill Rate (#10) and Majors to FTE BOR Appointed Faculty (#12). • The following benchmarks are used to determine health for Fill Rate: Healthy: 75 - 100% Cautionary: 60 - 74% Unhealthy: < 60% • An Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy

  36. Determination of program’s health based on efficiency cont… • The following benchmarks are used to determine health for Majors/FTE BOR Appointed Faculty: Healthy: 15 - 35 Cautionary: 36 - 60; 7 - 14 Unhealthy: 61+; 6 or fewer • An Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy • Finally, average the 2 overall health scores for Class Fill Rate and Majors/FTE BOR Appointed Faculty, then use the following rubric: 1.5 - 2 = Healthy .5 - 1 = Cautionary 0 = Unhealthy
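A minimal sketch of the full efficiency call across slides 35-36 (names are ours; handling of values between the published bands is our assumption):

```python
def fill_rate_score(fill_rate_pct: float) -> int:
    if fill_rate_pct >= 75:
        return 2                      # Healthy: 75-100%
    if fill_rate_pct >= 60:
        return 1                      # Cautionary: 60-74%
    return 0                          # Unhealthy: < 60%

def majors_per_fte_score(ratio: float) -> int:
    if ratio <= 6 or ratio >= 61:
        return 0                      # Unhealthy: 6 or fewer; 61+
    if 15 <= ratio <= 35:
        return 2                      # Healthy: 15-35
    return 1                          # Cautionary: 7-14; 36-60

def efficiency_health(fill_rate_pct: float, majors_per_fte: float) -> str:
    avg = (fill_rate_score(fill_rate_pct) + majors_per_fte_score(majors_per_fte)) / 2
    if avg >= 1.5:
        return "Healthy"              # average of 1.5 - 2
    if avg >= 0.5:
        return "Cautionary"           # average of .5 - 1
    return "Unhealthy"                # average of 0

print(efficiency_health(80, 25))      # Healthy
```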

  37. #17 Successful Completion (Equivalent C or higher) • Percentage of students actively enrolled in program classes at Fall and Spring census who at end of semester have earned a grade equivalent to C or higher.

  38. #18 Withdrawals (grade = W) • Number of students actively enrolled (at this point have not withdrawn) at Fall and Spring census who at end of semester have a grade of W.

  39. #19 Persistence Fall to Spring • Count of students who are majors in program at fall census (from Fall semester #3) and at subsequent Spring semester census are enrolled and are still majors in the program. • Example: 31 majors start in Fall 21 majors of the original 31 persist into Spring 21/31 = .6774 or 67.74%

  40. #20 Unduplicated Degrees/Certificates Awarded • Unduplicated headcount of students in the fiscal year reported to whom a program degree or any certificate has been conferred (derived from 20a, 20b, 20c, and 20d, counting each student once). • Uses most recent available freeze of fiscal year data. • For ARPD year 2010, the most recent fiscal data on August 15, 2010, was from FY 2010. • For ARPD year 2011, the most recent fiscal data on August 15, 2011, was from FY 2011. • For ARPD year 2012, the most recent fiscal data on August 15, 2012, was from FY 2012.
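A sketch of the unduplication step, using hypothetical student IDs behind each award count:

```python
degrees           = {"s1", "s2", "s3"}   # students behind 20a
certs_achievement = {"s2", "s4"}         # students behind 20b
advanced_prof     = {"s4"}               # students behind 20c
other_certs       = {"s1", "s5"}         # students behind 20d

# Measure #20 counts each student once, however many awards they earned.
unduplicated = len(degrees | certs_achievement | advanced_prof | other_certs)
print(unduplicated)   # 5 students, although 8 awards were conferred
```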

  41. #20a Number of Degrees Awarded • Degrees conferred in the FISCAL_YR_IRO. • The count is of degrees and may show duplicate degrees received in the program by the same student if the program offers more than one degree. • Uses most recent available freeze of fiscal year data. • FISCAL_YR_IRO: “Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004‐2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004, Fall 2004, and Spring 2005 semesters…”

  42. #20b Certificates of Achievement Awarded • Certificates of achievement conferred in the FISCAL_YR_IRO. • The count is of program certificates of achievement and may show multiple certificates of achievement in the same program received by the same student. • Uses most recent available freeze of fiscal year data. • FISCAL_YR_IRO: “Fiscal year, where the value indicates the ending of the fiscal year. For example, a FISCAL_YR_IRO value of 2005 indicates the fiscal year 2004‐2005 (July 1, 2004 to June 30, 2005) which includes Summer 2004, Fall 2004, and Spring 2005 semesters…”

  43. #20c Advanced Professional Certificates Awarded • The count is of program Advanced Professional Certificates and may show multiple Advanced Professional Certificates in the same program received by the same student. • Uses most recent available freeze of fiscal year data.

  44. #20d Other Certificates Awarded • The count is of other program certificates (such as the Academic Subject Certificate) and will show multiples received by the same student. • Uses most recent available freeze of fiscal year data.

  45. #21 External Licensing Exams Passed • These are data to be entered by the college (the same information currently collected on the Graduates and Leavers surveys). Note: Not all CTE programs have external licensing exams. • This value will be automatically calculated when you click on the “External” tab within the online tool web submission. Just click the edit button for your program, then enter the number sitting for the exam and the number that passed. Then just click on the button, “Save External Data.”

  46. #22 Transfers to UH 4-yr programs • Students with home campus UH Manoa, UH Hilo, or UH West Oahu for the first time in the reporting Fall who, prior to the reporting Fall, had a UH community college as home campus. • Also includes students who for the first time in the reporting Fall had Maui College as home campus and major ABIT or ENGT. • This is a program measure. A student is included in the count of program transfers for every program in which they have been a major at the college.

  47. #22a Transfers with credential from program • Students included in #22 who have received a degree from the community college program prior to transfer. • Does not include any certificates.

  48. #22b Transfers without credential from program • Students included in #22 who did not receive a degree from the community college program prior to transfer.

  49. Determination of program’s health based on effectiveness • This year the system office will calculate and report health calls for all instructional programs using academic year 2012 data. The following instructions illustrate how those calls are made. • Program Effectiveness is calculated using 3 separate measures: Unduplicated Degrees/Certificates Awarded (#20) / Majors (#3), Unduplicated Degrees/Certificates Awarded (#20) / Annual new and replacement positions (County prorated) (#2), and Persistence Fall to Spring (#19). • The following benchmarks are used to determine health for Unduplicated Degrees/Certificates Awarded per major: Healthy: > 20% Cautionary: 15 - 20% Unhealthy: < 15% • An Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy

  50. Determination of program’s health based on effectiveness cont… • The second measure used to determine health is Unduplicated Degrees/Certificates Awarded (#20) / Annual new and replacement positions (County prorated) (#2). • The following benchmarks are used for this measure: Healthy: .75 - 1.5 Cautionary: .25 - .75 and 1.5 - 3.0 Unhealthy: < .25 and > 3.0 • An Overall Category Health Score is assigned where: 2 = Healthy 1 = Cautionary 0 = Unhealthy
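A sketch of the first two effectiveness measures using the cut points from slides 49-50 (the third measure, Persistence Fall to Spring, is benchmarked on a later slide; boundary handling is our assumption):

```python
def awards_per_major_score(ratio: float) -> int:
    """Measure: #20 / #3."""
    if ratio > 0.20:
        return 2                  # Healthy: > 20%
    if ratio >= 0.15:
        return 1                  # Cautionary: 15-20%
    return 0                      # Unhealthy: < 15%

def awards_per_position_score(ratio: float) -> int:
    """Measure: #20 / #2 (County prorated)."""
    if ratio < 0.25 or ratio > 3.0:
        return 0                  # Unhealthy
    if 0.75 <= ratio <= 1.5:
        return 2                  # Healthy
    return 1                      # Cautionary: .25-.75 or 1.5-3.0

print(awards_per_major_score(12 / 50))      # 2: 24% of majors earned an award
print(awards_per_position_score(12 / 10))   # 2: 1.2 awards per county position
```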
