An Australian participant view of the outcomes of the Performance-Based Research Fund (PBRF) in New Zealand
Peter J. Dowling, Victoria University of Wellington, New Zealand
Some introductory comments about the PBRF in New Zealand universities.
The PBRF is a mixed performance assessment regime based on three components:
The term “NE” denotes a “new and emerging” researcher.
Academic staff members are required to complete an Evidence Portfolio (EP), which has the following components:
Research Output component (RO): 4 Nominated Research Outputs (i.e., the staff member's best research examples) from the last 5 years, plus up to 30 publications from the same period (70% weighting in the EP score)
Staff may also comment on any Special Circumstances which have had an impact on research output and of which the Panel should be aware.
Peer Esteem component (PE): recognition of the staff member's work by peers (15% weighting in the EP score)
Contribution to the Research Environment component (CRE): e.g., supervision of research higher degree (RHD) students and receipt of research grants (15% weighting in the EP score)
Each component is scored on a scale of 0 to 7.
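Taken at face value, the weightings above suggest the overall Evidence Portfolio score is a weighted combination of the three component scores. The formula and component values below are an illustrative assumption only; the exact PBRF aggregation and rounding rules are not stated here.

```latex
% Assumed weighted combination of component scores (illustrative only)
\text{EP score} = 0.70\,\mathrm{RO} + 0.15\,\mathrm{PE} + 0.15\,\mathrm{CRE}

% Worked example with hypothetical component scores RO = 5, PE = 4, CRE = 3:
0.70(5) + 0.15(4) + 0.15(3) = 3.50 + 0.60 + 0.45 = 4.55
```

On this reading, the maximum weighted score is 7.0, and the Research Output component dominates the final result.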
A number of NZ universities are considering whether they should declare some key areas of research strength (existing or planned) and allocate additional resources at the expense of poorly performing subject areas.
Most universities are carefully analyzing the results of the 2006 PBRF and deciding how they will manage their approach to the 2012 PBRF. Internal lists of “winners and losers” are under close discussion with reviews of “losers” proposed at some institutions.
A key problem area identified is the number of staff at the Senior Lecturer and Professorial levels who have been rated as R or C. Various early retirement schemes (with possible ‘teaching only’ contracts offered to some staff) are under discussion.
The effectiveness of current HR processes for the appointment, probation, study leave and promotion of academic staff is being debated. For example, after the 2003 PBRF round several universities moved to a policy of not appointing new staff who would not score at least a C in the 2006 round. These analyses have considerable industrial relations implications.
The strategies of universities which successfully increased their PBRF scores between 2003 and 2006 are being closely studied. For example, some universities in 2006 continued to appoint academic staff who were likely to be rated as an “R” right up to the audited PBRF census date of 14 June. Others have put considerable resources into raiding academic staff from higher ranked institutions to increase their future PBRF scores.
The “winners” of the 2006 PBRF race have been quick to use these results in their institutional marketing. However, it is unclear how much attention school leavers pay to institutional research performance when comparing universities.
In response to an Official Information Act request, it was recently revealed that nearly 100 academics were classified as ineligible for the 2006 Performance-Based Research Fund Quality Evaluation because they were deemed to be under "strict supervision". The strict supervision clause has been criticised as a mechanism for excluding staff who would normally have been rated "R" from the Quality Evaluation process.
One clear outcome of the PBRF process is differential government funding for research. There is general agreement that the implications for the higher education sector of an increasingly competitive research funding environment represent a considerable challenge to New Zealand universities.