
How an Assessment Framework helped revitalize Program Review at JCCC

Learn how the Assessment Framework at JCCC helped improve program review by focusing on student success and aligning program needs with campus priorities.


Presentation Transcript


  1. How an Assessment Framework helped revitalize Program Review at JCCC
    Bill Robinson, Professor, Mathematics
    Sheri Barrett, Director, Office of Outcomes Assessment

  2. PROGRAM REVIEW!!!!

  3. Assessment Cycle from JCCC Assessment Guide

  4. Primary Goals
    • Ensure that academic programs remain focused on student success and serving the needs of the community;
    • Enhance the resources and quality of academic programs by assessing program strengths and challenges (Continuous Quality Improvement);
    • Align academic program needs and campus priorities with the planning and budgeting process; and
    • Ensure that program priorities are consistent with the college's mission and strategic plan.

  5. Genesis of the Project
    • Ad Hoc Task Force worked on a template but did not implement it on campus
    • Higher Learning Commission meeting
      • Program Review is an expected practice
    • AQIP Project
    • Task Force – one-year timeline

  6. Task Force Work
    • Revisited the work of the Ad Hoc group
    • Held discussions with chairs, Faculty Senate, and Deans on expectations, review, and comments
    • Revised the original template to incorporate what we heard

  7. Task Force Work
    • Identify Technology Options to Support Program Review
    • Determine a Structure of Implementation
      • Early pilot
      • Timelines
      • Training
      • Order of participation

  8. Task Force Work
    • Explore ways to “close the loop” on the Program Review process
    • Look for examples from other community colleges
    • Recommend a new template with a built-in feedback mechanism
    • Integrate the budget process into Program Review

  9. Comprehensive Academic Program Review
    • Philosophy that guided development
      • Program Quality
      • Regular and Effective Process
      • Data and Information
      • Analysis and Use of Data and Information

  10. Program Review Elements
    • Data Elements
    • Student Success
    • Co-Curricular Mapping
    • Assessment of Student Learning Outcomes
    • Curriculum Processes
      • Revisions
      • Honors
      • New Offerings
    • Faculty Success
    • Long/Short Term Goal Setting
      • Ties to KPIs
    • External Constituency & Significant Trends
      • Advisory Boards
      • Specialized Accreditation
    • Reflection on Data and Trends
    • Resource Requests (Budgeting)

  11. Data Elements
    • Ratio of faculty to SFTE
    • # of Full-time/Part-time Faculty
    • Placement rates (employment rates in the field)
    • Transfer rates
    • Degree/Certificate Completion
    • Course Attrition/Completions
    • Student Success (A, B, C, or Pass)
    • Program Costs per Credit Hour
    • Costs per FTE

  12. Program Review Cycle

  13. Progress and Revisions
    • Piloted the new process with 8 “volunteers”:
      • Reading
      • English
      • Journalism and Media Communications
      • Paralegal
      • Early Childhood Education
      • EAP (English for Academic Purposes)
      • HPER (Health, Physical Education and Recreation)
      • Biotechnology

  14. Feedback From Pilots & Ongoing
    • Approaches used by pilots
      • What worked
      • What didn’t work
    • Ongoing lessons learned
      • Revisions to the data
      • Review of campus-wide processes
      • Inclusion of an annual piece

  15. Program Review Committee
    • Faculty elected from the Divisions
    • 3 representatives appointed by the Provost
    • Provide collegial feedback to the programs in comprehensive review

  16. Vitality Indicators
    • First self-assessment by programs
      • Demand
      • Quality
      • Resource Utilization
      • Overall Vitality Recommendation
    • Summary assessment by Divisional Deans

  17. Logistics
    • Software to support processes
    • Guidelines for PRC
    • Templates
    • Support for the Programs
      • Website
      • Training
      • Handbook
      • Videos

  18. Closing the Loop
    • Feedback from PRC
    • Opportunity for Programs to revise submission
    • Complete self-evaluation – Vitality
    • Deans read and complete Vitality Recommendation
    • Meet with Program to review recommendations
    • Vitality Recommendations consolidated and used by Instructional Deans Council in allocating faculty positions and resources

  19. Program Review Administrative Support
    • Role of OOA in Program Review
      • Support office: answer questions, support the PRC
      • Facilitate requests for additional data
      • Make results public
    • Institutional Research
      • Data provided for departments
    • Budget Office
      • Budget data and questions about budget information

  20. Administrative Program Review
    • Launched as a pilot in 2015-16
    • Launched campus-wide Spring 2017
    • Integrates work of the Academic Program Review
      • Common philosophy
      • Framework for reporting
    • Supported by Institutional Effectiveness

  21. Questions
    Bill Robinson, Professor, Mathematics Division
    Sheri Barrett, EdD, Director, Office of Outcomes Assessment
