  1. FROM EVALUATION REPORTS TO ACTION – CLOSING THE LOOP WITH DEPARTMENTAL REVIEWS AT STELLENBOSCH UNIVERSITY Junay Adams jadams@sun.ac.za

  2. Outline of presentation • Introduction • Overview: Core elements of review process • Self-evaluation • External review • Follow-up reporting • Resources used • Resources generated • Categories of actions to be taken • Analysis • Deductions/conclusions

  3. 1. INTRODUCTION • Compulsory quinquennial external evaluation of academic departments 1992-1997 and 1998-2003 • Third cycle 2004-2009 • Academic as well as support environments • Objective of review – development and improvement • Purpose of the analysis • To determine whether the objective of the review processes is achieved • Do review processes just generate reports, or do they effect change/development/improvement? • Data sources • Reports of the Quality Committee (2007-2009) (incl. summaries of external panel reports, response of the unit, Dean’s response, QC’s response) • Results from the 2009 feedback survey

  4. 2. Core elements of review process 2004-2009 • External panel appointed • Self-evaluation based on criteria • Academic departments – generic criteria • Support services – develop their own • Report submitted to external panel • Panel conducts site visit • Panel report • Department/division response • Quality Committee • Executive Committee of Senate • Follow-up report (after 2 years)

  5. 3. Resources used • More resources used than in previous cycles • More detailed core statistics (incl. SMIs) • Support service reviews included in review framework • More international panel members • 2 previous cycles combined – only 5 • 2004-2009 cycle – 38 • More money spent • Cycle 1 – average R3 500 per department • Cycle 2 – average R5 200 per department • Cycle 3 – average R35 000 per department • Criteria more detailed • More administrative support provided • Assistance with development of criteria • Scribe services for external panels
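The per-department figures above imply roughly a ten-fold increase in spending from the first to the third cycle. A minimal sketch of that comparison follows; the 75 reviewed units mentioned on the next slide are used only as a hypothetical base for a third-cycle total, since unit counts for the earlier cycles are not given in the presentation.

```python
# Hypothetical sketch: average review spend per department across the three cycles,
# using the averages quoted on the slide (amounts in Rand).
avg_cost_per_dept = {
    "Cycle 1 (1992-1997)": 3_500,
    "Cycle 2 (1998-2003)": 5_200,
    "Cycle 3 (2004-2009)": 35_000,
}

baseline = avg_cost_per_dept["Cycle 1 (1992-1997)"]
for cycle, cost in avg_cost_per_dept.items():
    # Print each cycle's average and its ratio to the first cycle.
    print(f"{cycle}: R{cost:,} per department ({cost / baseline:.1f}x cycle 1)")

# Illustrative total for cycle 3 only, assuming the 75 units reviewed in 2004-2009.
print(f"Cycle 3 total (assumed 75 units): R{35_000 * 75:,}")
```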

  6. 4. Resources generated • On average, 4 reports generated per review • Self-evaluation (SE) report • External panel report • Department/division’s formal response • Follow-up report • 75 departments and divisions reviewed 2004-2009 • Therefore 300 reports • Excluding reports of the Quality Committee and criteria documents developed by support units

  7. 5. CATEGORIES OF ACTIONS TO BE TAKEN • Category 1 – To be addressed by department/division • Within their control, e.g. • changes to curriculum • staff mentoring • publishing in accredited journals, etc. • Category 2 – To be addressed by institution (management) • Cross-cutting issues • Requires inputs at central budgeting and planning level, e.g. • need for improved coordination of enrolment planning • need for succession planning • attention to physical facilities • strategies for recruiting black staff and students • challenges with regard to institutional culture, etc.

  8. 6(a). Analysis: CATEGORY 1 issues • Self-evaluation viewed as valuable by departments and divisions • 100% of respondents to the feedback survey agree • Action taken even prior to the external visit • “Anticipatory action” (Fredericks & de Haan, 1997) • However, not merely to please the panel but out of their own conviction of the value added • Agreement that external panels, as well as international panel members, add value • Thorough discussion of recommendations by unit, line managers and QC • “passive utilisation” (Fredericks & de Haan, 1997) • Incorporation of action plans into strategic planning of department/division (“active utilisation”) • 50% - strongly agree • 42% - agree • 8% - disagree • Value added justifies resources used • 25% - strongly agree • 58% - agree • 17% - disagree

  9. 6(b). ANALYSIS: CATEGORY 2 ISSUES • Evidence of issues being addressed, e.g. • Development of faculty personnel plans to address succession planning and planning for diversity • Institutional facilities plan • Processes and structures in place for improved enrolment planning • Not clear whether these actions for change were due to the review processes or to other processes, e.g. • Budgeting and planning considerations • HEQC audit and plans identified in the Quality Development Plan • Other central processes highlighting specific needs to be addressed

  10. 7. DEDUCTIONS/CONCLUSIONS Category 1 issues (addressed by department/division) • Much evidence of (value-adding) action taken • Often a direct correlation between actions taken and the review process • Easier to trace “active utilisation” to the review process • Goal of review, i.e. improvement – achieved Category 2 issues (for institutional attention) • Evidence that most issues addressed to a certain degree • Not clear whether changes were necessarily a result of the review, but it definitely increases institutional awareness, along with other factors • But this is consonant with international research System efficiency • Generally, the 3 building blocks and feedback structures are effective and should be maintained in the next cycle • But not necessarily efficient – should be more streamlined and focused

  11. THANK YOU • QUESTIONS?
