Academic Productivity and Efficiency Review at NC State

Presentation Transcript


  1. Academic Productivity and Efficiency Review at NC State

  2. Program Assessment: A Three-Phase Process • External Review – completed on a periodic basis • Internal Review – continuous and ongoing outcomes-based assessment • Program Productivity & Efficiency – for institutional strategic planning

  3. External Pressures - State and System Climate • Recurring Budget Reductions • Performance Metrics Informing New Funding • Primarily undergraduate • New Academic Planning Guidelines • Academic Program Productivity Review • Biennial Productivity Review • Number of majors and graduates • Island and Satellite Analysis

  4. Internal Situation - Strategic Realignment Recommendations • Administrative Consolidation: Consolidate Equity and Diversity Offices; Discontinue Office of Extension, Engagement and Economic Development; Merge Undergraduate Academic Programs and Student Affairs • Business Services: Modify Reporting Lines; Create Business Operations Centers • Academic Programs: Review Summer Education; Review Distance Education; Review Academic Science Programs; Modify Academic Planning Process; Review Academic Program Efficiency and Effectiveness • Organizational Bureaucracy: Review Administrative Processes for Efficiency; Review Policies, Regulations and Rules

  5. Our Realities (Why We Need to Prioritize) • Programs have grown in number and size without a critical assessment of need, productivity and efficiency • Across-the-board cuts = Mediocrity • Not all programs are equal • Some are more (or less) productive or efficient • Some are more central to the mission • The most likely source for needed resources is reallocation of existing resources • Reallocation requires responsible prioritization

  6. Preliminary Analysis – Spring 2011 Pilot “Toward Greater Effectiveness and Efficiency in Academic Courses and Programs” • Low-enrolled courses • Recommendations: Inactivate courses not taught in the last 5 years; re-establish minimum class sizes • Undergraduate programs • Evaluated 5 variables • 29 programs “flagged” for further review • Graduate programs • Evaluated 10 variables • 17 Master’s programs and 15 Doctoral programs “flagged” for further review
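
The slide does not spell out the flagging rule; a plausible reading is that a program was flagged when it fell in the bottom quartile on enough of the evaluated variables. A minimal Python sketch of that idea, with a hypothetical threshold and made-up variable names (the actual variables and cutoffs from the Spring 2011 pilot are not reproduced here):

```python
# Hypothetical flagging rule: flag a program for further review when it sits
# in the bottom quartile on at least `threshold` of the evaluated variables.
from statistics import quantiles

def flag_programs(programs, variables, threshold=2):
    """programs: {program name: {variable: value}}; higher values = better."""
    # First-quartile cutoff for each variable, computed across all programs.
    cutoffs = {
        var: quantiles([m[var] for m in programs.values()], n=4)[0]
        for var in variables
    }
    flagged = {}
    for name, metrics in programs.items():
        low = [v for v in variables if metrics[v] <= cutoffs[v]]
        if len(low) >= threshold:
            flagged[name] = low  # record which variables triggered the flag
    return flagged
```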

  7. Review of Academic Programs • 14-member Task Force and 8-member Support Team (Co-chairs: John Ambrose & Duane Larick) • Charged: May 2011 • Report to Provost: May 2012 (delivered October 2012) • More info: go.ncsu.edu/academic-program-review

  8. The Book on Program Prioritization • Robert C. Dickeson, Prioritizing Academic Programs and Services: Reallocating Resources to Achieve Strategic Balance. Jossey-Bass (John Wiley & Sons), 1999; revised and updated 2010.

  9. Steps Involved in the Process (Dickeson, 2010) • Define the “programs” (academic and non-academic alike) • Establish criteria • Select weights for each criterion • Collect data • Score each program against the criteria • Rank the programs • Decide what to do with each program • Do it!
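
Steps 2 through 6 amount to a weighted scoring model. A minimal sketch, assuming illustrative criteria, weights, and 1-5 scores (not the ones Dickeson or the Task Force actually used):

```python
# Weighted score-and-rank sketch of Dickeson's steps: weight each criterion,
# score each program against the criteria, then rank by total weighted score.
def rank_programs(scores, weights):
    """scores: {program: {criterion: score}}; weights: {criterion: weight}."""
    totals = {
        program: sum(weights[c] * s for c, s in criteria.items())
        for program, criteria in scores.items()
    }
    # Highest weighted score first.
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

# Illustrative (hypothetical) criteria, weights, and scores.
weights = {"external_demand": 0.4, "quality_of_outcomes": 0.35, "cost": 0.25}
scores = {
    "Program A": {"external_demand": 4, "quality_of_outcomes": 5, "cost": 3},
    "Program B": {"external_demand": 2, "quality_of_outcomes": 3, "cost": 4},
}
print(rank_programs(scores, weights))
# Program A ranks first (weighted score ~4.1 vs. ~2.85)
```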

  10. Recommended Criteria (Dickeson, 2010) • History, development and expectations of the program • External demand • Internal demand • Quality of program inputs and processes • Quality of program outcomes • Size, scope and productivity of the program • Revenue and other resources generated by the program • Cost and other expenses generated by the program • Impact, justification and essentiality of the program • Opportunity analysis of the program

  11. Questions We Set Out To Answer • Which programs are the most and least productive? • Which programs are the most and least effective in graduating students in a timely manner? • Which programs have the most and least demand for their degrees? • Which programs are the most and least efficient in their use of faculty resources?

  12. Review of Academic Programs Guiding principles • The process must be open and effectively communicated to stakeholders • No data set is perfect • Not all departments and programs are measured equally well by each metric • All metrics must be clearly defined and the TF should help stakeholders understand the data • The choice of metrics will likely influence future behaviors

  13. Review of Academic Programs Guiding principles • Both quantitative and qualitative data should be used - no single metric or group of metrics can be used to identify actions to be taken; the TF should add judgment in recommending actions • The need for transparency must be balanced with the need to avoid putting programs in jeopardy • Bringing programs and resources into balance will promote quality • Connecting our planning and performance assessment will advance our strategic goals

  14. Metrics • Unit of analysis • academic program and/or department • Quantitative and qualitative • Institutional data provided by Task Force for program verification • Other information requested from departments via survey • Metric categories: • Undergraduate programs • some at Department and some at degree level • Graduate programs • MR & DR evaluated separately • Evaluated at program level • Background metrics • Expenditure data

  15. Departmental/Program Survey Request for additional information • Peer-reviewed publications and other scholarly contributions per faculty FTE • Placement rates for MR and DR graduates in jobs, graduate school, or as postdocs • Description of program’s synergy with NC State mission • Any special circumstances that make the department unique and are not captured in the university-level data • Any additional narrative information about the program that should be taken into consideration

  16. Doctoral Programs Sample Data [table comparing top-quartile and bottom-quartile programs on the evaluated metrics]
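
For illustration, the quartile split behind a table like this can be computed directly. A short sketch with made-up program names and a made-up time-to-degree metric (the real data are institutional):

```python
# Illustrative only: bucket programs into top/bottom quartiles on one metric.
from statistics import quantiles

time_to_degree = {  # hypothetical metric: median years to doctoral degree
    "Prog A": 4.8, "Prog B": 7.2, "Prog C": 5.5, "Prog D": 6.9,
    "Prog E": 5.1, "Prog F": 6.2, "Prog G": 4.5, "Prog H": 5.8,
}
q1, q2, q3 = quantiles(time_to_degree.values(), n=4)

# For time-to-degree, lower is better: the top quartile is at or below q1.
top = sorted(p for p, v in time_to_degree.items() if v <= q1)
bottom = sorted(p for p, v in time_to_degree.items() if v >= q3)
print("Top quartile:", top)        # e.g. ['Prog A', 'Prog G']
print("Bottom quartile:", bottom)  # e.g. ['Prog B', 'Prog D']
```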

  17. Sample Scorecard [redacted template] • Graduate Dept. Scorecard - 0000000 (OUC Code) - Name of Department (College=00, Department=000000, Program Code=XX) • Graduate Background Measures - 0000000 (OUC Code) - Name of Department

  18. Academic Program Review Process

  19. Task Force Outcomes • Program Level: Recommendations including changes in focus, consolidation or elimination • College Level: Recommendations resulting from the number and size of programs (opportunities for consolidation) • University Level: Recommendations in areas such as internal transfers (UG), retention, and 6-year graduation rates (DR)

  20. Progress To Date • Analysis and report reviewed with leadership of each College • Anticipate on-going discussions between the Provost, College Deans and Programs • Deans were asked to provide an update on the process for and progress on responding to the data and recommendations in the Task Force report • Data/Analysis being institutionalized into the annual review process for Colleges/College Deans

  21. Challenges: Determining Scope of Review • What is a “program” (i.e., the unit of measure)? • Academic programs only • Undergraduate and Graduate • All academic programs (BS and BA; minors, certificates, etc.) • Departmental versus interdepartmental and interdisciplinary programs, etc.

  22. Challenges: Metrics and Data • All metrics must be clearly defined so stakeholders can understand the data • Much of the data you want/need may not be available • Maintenance of the database for the future

  23. Challenges: Campus Buy-in/Acceptance • Faculty/Program Resistance • “What’s to become of the affected faculty?” • “What’s to become of the affected students?” • “Shouldn’t the administration have to prioritize its programs, as well?” • “Let us keep this program – it doesn’t really take any resources.”

  24. Lessons Learned - Data • No data set is perfect – don’t let that derail the effort • Not all departments and programs are measured equally well by each metric – accept that • All metrics must be clearly defined so stakeholders can understand the data • Both quantitative and qualitative data should be used - no single or group of metrics can be used to identify actions to be taken; judgment is needed in recommending actions

  25. Lessons Learned - process • Institutional commitment is needed from the start • What is the overriding goal of this effort? • Must have a plan for how the outcomes will be used and who will be charged with decision making

  27. Lessons Learned - process • Have a plan for engaging “programs” and faculty from the beginning • Must be transparent • Have a communication plan from the beginning • The process will take more time than planned • Planning, collecting, analyzing, reporting and acting will take time • Must include an opportunity for review by programs prior to final recommendations • A follow-up plan is needed

  28. University of XYZ Faculty Push Back On Program Cuts • University Business Magazine: the headline that the University of XYZ would eliminate as many as 55 academic programs caught many faculty members by surprise • “I saw it on the news this afternoon, but I got an email yesterday.” • “The list put out falsely implies the faculty had voted to end some programs when in fact what we had wanted to do was to suspend admissions to fix the program. That’s what surprised me because I didn’t expect that kind of misunderstanding.” • “What we have to do is to say, ‘look guys, we need to get together and we need to figure out how to address this,’ and to do it quickly.”

  29. QUESTIONS and DISCUSSION
