The Academic Program Review


  1. The Academic Program Review Bridging Standards 7 and 14 Middle States’ Annual Conference December 10, 2010

  2. Presenters Mr. H. Leon Hill Director of Institutional Research Dr. Joan E. Brookshire Associate Vice President of Academic Affairs

  3. Overview • Framework to Address the APRs • Structure/Challenges/Approach • Examples of Metrics • Current Action Plan • Integration of “End User” Technology • Next Steps • Benefits of Our Approach • Questions

  4. Assessment Cycle - 2005

  5. What we had to build on • Strong focus on programs. • State-mandated 5-year academic program review in need of revision. • Institutional Effectiveness Model (IEM) with performance indicators benchmarked through state and national databases.

  6. Mission → Strategic Initiative: Access & Success → Institutional Effectiveness

  7. IEM • Needed a way to assess how the College was performing on key metrics in relation to prior years/semesters and compared to other institutions. • Historical/trend data • Benchmark data • Pennsylvania & national peers
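A toy illustration of the kind of comparison the IEM supports: one metric tracked against its own history and a peer benchmark. Every name and figure below is invented for the sketch; real values would come from the state and national databases mentioned above.

    # Illustrative only: trend and benchmark comparison for one IEM metric.
    # The metric, values, and peer figure are invented for this sketch.
    history = {2007: 0.68, 2008: 0.70, 2009: 0.71, 2010: 0.73}  # e.g., fall-to-fall persistence
    peer_benchmark = 0.70  # hypothetical median of Pennsylvania/national peers

    first, latest = min(history), max(history)
    trend = history[latest] - history[first]
    gap = history[latest] - peer_benchmark
    print(f"{latest}: {history[latest]:.0%} "
          f"(trend {trend:+.1%} since {first}; peer gap {gap:+.1%})")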

  8. Institutional Effectiveness Model

  9. Where we started • Restructured the Academic Program Review process • Incorporated the use of technology

  10. Goal of the restructuring • Measure student performance as evidenced by results of assessment of student learning outcomes. • Measure program performance as evidenced by comparison of program performance to overall college performance on specific key indicators (current and aspirational).

  11. Challenges • Usual issues with assessment in general. • Faculty had little knowledge of the College’s performance indicators. • Organizational separation of assessment of institutional and student learning outcomes.

  12. Approach Began by working backwards from the IEM, mapping specific core indicators to program data and making additions where needed.

  13. Examples of Metrics Used for APR

  14. Definitions of Success & Retention • Success = grades of (A, B, C & P) / (A, B, C, D, F, P & W) • Retention = grades of (A, B, C, D, F & P) / (A, B, C, D, F, P & W)
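A minimal sketch of how these two rates might be computed from one section's final grades, assuming the definitions above (success = A/B/C/P, retention = every non-W grade); the function name is ours:

    # Sketch: success and retention rates from final grades, per the
    # definitions above. Assumes retention counts every non-W grade;
    # adjust if your institution defines it differently.
    from collections import Counter

    SUCCESS = {"A", "B", "C", "P"}
    GRADED = {"A", "B", "C", "D", "F", "P", "W"}

    def course_rates(grades):
        """Return (success_rate, retention_rate) for one course section."""
        counts = Counter(g for g in grades if g in GRADED)
        total = sum(counts.values())
        if total == 0:
            return 0.0, 0.0
        success = sum(counts[g] for g in SUCCESS) / total
        retention = (total - counts["W"]) / total  # students who did not withdraw
        return success, retention

    print(course_rates(["A", "B", "C", "D", "F", "P", "W", "W"]))  # (0.5, 0.75)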

  15. Added a curricular analysis • How well program goals support the college’s mission. • How well individual course outcomes reinforce program outcomes. • How well instruction aligns with the learning outcomes.

  16. Specific assessment results • Changes made based on the assessment findings • Evidence of closing the loop • Changes made to the assessment plan

  17. Action Plan • Outcomes expected as a result of appropriate action steps. • Timelines and persons responsible for each action step. • Resources needed with specific budget requests. • Evaluation plan with expected benefits.

  18. Bottom Line • Is there sufficient evidence that the program learning outcomes are being met? • Is there sufficient evidence that the program is aligned with the college on specific key indicators?

  19. The Framework (flow diagram: Assessment Results → Curriculum Committee → President’s Office → Board of Trustees)

  20. Addition of Technology • Worked in concert with Information Technology to integrate iStrategy with the ERP (Datatel). • This implementation let end users obtain the data needed for program assessment themselves, without going through a middleman (IR and/or IT).
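As a concrete (and purely hypothetical) picture of what self-service access can look like, the sketch below runs a course-success query against an in-memory table; the table and column names are invented, not actual iStrategy or Datatel objects.

    # Hypothetical self-service query; the warehouse schema is invented,
    # not an actual iStrategy/Datatel structure.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE course_grades (term TEXT, course TEXT, grade TEXT);
        INSERT INTO course_grades VALUES
          ('2010FA', 'ACC 111', 'A'), ('2010FA', 'ACC 111', 'C'),
          ('2010FA', 'ACC 111', 'W'), ('2010FA', 'MAT 010', 'B'),
          ('2010FA', 'MAT 010', 'F'), ('2010FA', 'MAT 010', 'P');
    """)

    # Success rate per course: (A, B, C, P) over all recorded grades.
    for term, course, rate in conn.execute("""
        SELECT term, course,
               AVG(CASE WHEN grade IN ('A','B','C','P') THEN 1.0 ELSE 0 END)
        FROM course_grades GROUP BY term, course
    """):
        print(term, course, f"{rate:.0%}")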

  21. Next Steps in the Evolution of College and Program Outcomes

  22. Example of APR Report Card

  23. Examples of Course Success

  24. Success in ACC 111

  25. Success in ACC 111

  26. Success in Math 010

  27. Success in Math 010

  28. Benefits • Builds a bridge between Standards 7 and 14. • Better data. • Putting data in the hands of faculty actively engages them in using data for decisions/planning. • IR time is better used.

  29. Benefits (continued) • Annual planning cycle developed. • Built a culture of assessment in several of the academic divisions. • Curricular changes that align with the graduation initiative. • Curricular and program improvement. • Created a college-wide model for improvement of student learning.

  30. Evolution of the Dashboard • Creation of a Student Success Dashboard. Metrics: • Course-level success and retention (developmental and college-level) • Persistence (fall to spring and fall to fall) • Progression of various cohorts of students • College-level success in Math or English after developmental Math or English • Graduation • Transfer
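As one example of a dashboard metric, a fall-to-fall persistence calculation might look like the sketch below; the student IDs are fabricated, and the choice to count graduates as persisters is our assumption, not necessarily the College's definition.

    # Illustrative fall-to-fall persistence for the dashboard.
    # IDs are fabricated; counting graduates as persisters is an assumption.
    fall_2009 = {"s01", "s02", "s03", "s04", "s05"}  # fall 2009 cohort
    fall_2010 = {"s01", "s03", "s05", "s09"}         # enrolled the next fall
    graduated = {"s02"}                              # completed before fall 2010

    persisters = (fall_2009 & fall_2010) | (fall_2009 & graduated)
    print(f"Fall-to-fall persistence: {len(persisters) / len(fall_2009):.0%}")  # 80%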

  31. Graphic Representation for the SSD

  32. Graphic Representation for the SSD

  33. Final Thoughts • It’s not perfect, but it works for us. • Do the research on which tools are appropriate for your college • Assessment of the core curriculum • Launching of assessment software • It all starts with asking the right question • PRR 2010

  34. Questions

  35. Presenters Mr. H. Leon Hill hlhill@mc3.edu Director of Institutional Research Dr. Joan E. Brookshire jbrooksh@mc3.edu Associate Vice President of Academic Affairs
