
Benchmarking in HE: the HESA benchmarking project Graham Fice Consultant



  1. Benchmarking in HE: the HESA benchmarking project Graham Fice Consultant

  2. Outline of presentation • Background and project deliverables: • HEFCE funded • Planning and data focus • Other benchmarking projects – Europe and UK • The new HESA/JISC Benchmarking InfoKit: • What is benchmarking and why do it • Steps in the benchmarking process • Data • Maturity model for benchmarking • International benchmarking • Future: • Growing use of benchmarking • heidi development • ‘Redesigning the HE data and information landscape’ project – a pathway to reform on governance and data

  3. Project phase 1 • A rapid review of benchmarking within the HE sector, funded through the University Modernisation Fund • Objectives – produce: • An inventory of activities • A mapping of relevant data resources • Selected case studies illustrating a cross-section of activities, including benefits • Recommendations for further action to improve benchmarking • Undertaken through interviews with HEIs and sector bodies, an online survey of the planning community, and a survey of the existing literature • Report and recommendations welcomed by HEFCE – report on the HESA website

  4. Project phase 2: objectives • Promote benchmarking as a valuable technique to senior managers • Establish business requirements for benchmarking: • Develop and promulgate models of good practice • Draw together and enhance the available information on benchmarking activities, techniques, benefits and networks: • Promote greater understanding of existing techniques • Provide resources and training on techniques to HE staff • Establish a business-focused map of information sources to support benchmarking: • Identify business activities that are poorly served by available data • Find ways to overcome access barriers to data resources • Enhance heidi as a benchmarking tool and data resource • Learn from best practice within other sectors • Develop models for international HE benchmarking

  5. Deliverables • Six focused events plus presentations to conferences, interest groups and individual institutions by invitation • Included input from private providers • Key link with JISC Environment Scanning and BI workstreams: • http://www.jiscinfonet.ac.uk/infokits/strategy/environment-scanning • Newly-launched JISC InfoKit sits within workstreams: • http://www.jiscinfonet.ac.uk/infokits/strategy/environment-scanning/benchmarking/index_html • Two reports (on website): • International benchmarking in UK HE (PA Consulting) (October 2011) • Benchmarking to improve efficiency (November 2010) • Governance through UUK Efficiency and Modernisation Task Force (the Diamond group)

  6. European benchmarking project • Workshops and conferences • Handbook and practical guide • http://www.education-benchmarking.org/ • Presentation/update on the Benchmarking website • EU projects reviewed by PA: Indicators for Mapping and Profiling Internationalisation (IMPI), U-Map/U-Multirank, EUMIDA feasibility study for creating a European University data collection

  7. AMHEC Benchmarking project • Presentation on Benchmarking website • http://www.benchmarkinginhe.co.uk/

  8. JISC/HESA Benchmarking InfoKit • http://www.jiscinfonet.ac.uk/infokits/strategy/environment-scanning/benchmarking/index_html

  9. Benchmarks and benchmarking • Benchmarks are measurements used for comparison • Benchmarking is the process of finding good practices and of learning from others • ‘A way of not only doing the same things better but of discovering new, better and smarter ways of doing things and in the process of discovery, understanding why they are better or smarter’ (Gallagher) • Benchmarking is also used to demonstrate accountability to stakeholders and to support academic quality

  10. Vice-Chancellor and Head of Planning - quotes • The HESA PIs and performance against benchmarks are included in the suite of performance information shared routinely with Governors; the indicators help to provide assurance to stakeholders, including the public and policy makers • Benchmarks and relative performance are a good and objective way of ensuring the University is on track with its plans • The aim of benchmarking is to place performance in perspective against the sector or a group of institutions • Identifying institutions with a high level of performance can identify good practice. By analysing, assessing and implementing examples based on good practice, institutions can achieve more efficient processes and set higher levels of performance • Sensible benchmarking can lead to realistic target setting across a broad spectrum of performance indicators

  11. Why benchmark (from European project) • Self-assess performance • Better understand the processes which support strategy formulation and implementation in increasingly competitive environments • Measure against and compare with other institutions or organisations, and assess the reasons for any differences • Encourage discovery of new ideas through a strategic look (inside or outside the institution) • Obtain data to support decision-making • Set effective targets for improvement • Strengthen institutional identity, strategy formulation and implementation • Enhance reputation • Respond to national (or international) performance indicators and benchmarks • Set new standards for the institution and sector

  12. Benchmarking is not • A ‘cookbook’ from which elements can be selected – rather, an integrated and integral part of strategic management • A measurement mechanism without a process of discovery and learning • The presentation of data without action on the data • A mechanism for resource reduction, although resources may subsequently be redeployed in a more effective way to increase institutional performance

  13. Key success factors from project and literature: institutional and process

  14. Types of benchmarking • Implicit (a by-product of information gathering) or explicit (deliberate and systematic) • Conducted as an independent (without partners) or a collaborative (partnership) exercise • Confined to a single organisation (internal exercise), or involving other similar or dissimilar organisations (external exercise) • Focused on an entire process (vertical benchmarking) or on part of a process as it manifests itself across different functional units (horizontal benchmarking) • Focused on inputs, process or outputs (or a combination of these) • Based on quantitative and/or qualitative information • Norman Jackson and Helen Lund, Benchmarking for Higher Education, SRHE/OU, Buckingham (2000)

  15. Time/effort/characteristics of different types of benchmarking

  16. Metric vs process benchmarking (focus of InfoKit)

  17. Strategy contingent approach

  18. Steps 1 and 2: Set objectives and investigate context – link to strategic planning

  19. Step 3: Research target • Key prompts: • What should we measure? • How should we measure: quantitative or qualitative data? • What data/ information are available? • Are the data of sufficient quality and coverage to support our intended measurement? • Formal > informal process

  20. Step 4: Gather data/information • Key prompts: • Gather data/ information • Select comparator groups • Decide on periodicity of measurement

  21. Selection of comparators • Range of comparators from 5 to 50 (Sheffield: 27 = Russell Group) • Smaller and specialist institutions – relative lack of comparators, or comparators well known • Various factors used • Different comparators depending on the type/purpose of benchmarking – not always the Mission group • Change over time • Careful selection
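To make comparator selection concrete, here is a minimal Python sketch. The institution table, column names and thresholds are all invented for illustration; a real exercise would draw on heidi or HESA data and apply whatever selection factors suit the purpose of the benchmarking.

```python
# Illustrative sketch: selecting a comparator group from a table of
# institution-level indicators. Columns and figures are hypothetical,
# not a heidi or HESA schema.
import pandas as pd

institutions = pd.DataFrame({
    "institution": ["A", "B", "C", "D", "E"],
    "mission_group": ["Russell", "Russell", "Million+", "Russell", "GuildHE"],
    "fte_students": [28000, 31000, 12000, 26000, 4000],
})

def select_comparators(df, mission_group=None, fte_range=None):
    """Filter to a comparator group; the criteria passed in vary with
    the type and purpose of the benchmarking exercise."""
    mask = pd.Series(True, index=df.index)
    if mission_group is not None:
        mask &= df["mission_group"] == mission_group
    if fte_range is not None:
        lo, hi = fte_range
        mask &= df["fte_students"].between(lo, hi)
    return df[mask]

# e.g. similarly sized institutions within one mission group
peers = select_comparators(institutions, mission_group="Russell",
                           fte_range=(20000, 35000))
print(peers["institution"].tolist())  # ['A', 'B', 'D']
```

Wrapping the filter in a function reflects the point above: different exercises pass different criteria, rather than defaulting to the mission group every time.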

  22. A data-rich environment - key • Checklist: • Relevance • Timeliness • Stability over time • Quality • Comparability • Many sources including HESA: • PIs • heidi • International sources already available and a new hosted resource launching soon • JISC listing in the Environment Scanning/BI InfoKit • http://www.jiscinfonet.ac.uk/infokits/strategy/environment-scanning/supply

  23. Steps 5 and 6: Measure, present and evaluate – key prompts • Use measurements and benchmarks to evaluate performance: • National PIs – adjusted sector average (subject, qualification/age on entry) plus some location adjustment • But what do the figures say? • Presentation – for the appropriate audience (InfoKit detail) • Determine gaps and/or establish process differences • Set targets and monitor
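As an illustration of measuring against an adjusted sector average, the sketch below computes a mix-adjusted benchmark in the spirit of the HESA PIs: the sector rate in each factor cell is weighted by the institution's own student mix, so the benchmark shows what the sector would achieve with this intake. The cells, rates and mix are invented; the real methodology uses more factors (subject, entry qualifications, age, and for some indicators location).

```python
# Minimal sketch of a mix-adjusted benchmark. All figures invented.

# sector continuation rate per factor cell (here just subject)
sector_rate = {"medicine": 0.97, "engineering": 0.90, "arts": 0.88}

# this institution's share of students in each cell (sums to 1)
own_mix = {"medicine": 0.10, "engineering": 0.50, "arts": 0.40}

own_rate = 0.915  # institution's observed continuation rate

# benchmark = sum over cells of (institution's share * sector rate)
benchmark = sum(own_mix[c] * sector_rate[c] for c in own_mix)
gap_pp = (own_rate - benchmark) * 100  # gap in percentage points

print(f"adjusted sector benchmark: {benchmark:.1%}")  # 89.9%
print(f"performance vs benchmark: {gap_pp:+.1f} pp")  # +1.6 pp
```

The gap, not the raw rate, is the figure that answers "but what do the figures say?": here the institution is 1.6 percentage points above what its intake mix would predict.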

  24. Step 7: Manage improvement and change – time/effort

  25. Step 8: Review strategic objectives • Benchmarking is not a one-off: • Benchmarking becomes an intrinsic part of the continuous development, refinement and implementation of strategy • In a rapidly changing environment, all aspects of the process intended to effect improvement must be regularly reviewed • ‘Benchmarking is not a black-box technology. The success of a benchmarking exercise ultimately comes down to the capability of managers to use that information to better understand their institutional situation and produce an agenda for strategic change.’ • European Benchmarking Project

  26. Benchmarking maturity model • Different: • Starting points for institutions (and within institutions) • Priorities and capabilities • Model sets out levels 1-3 against: • Leadership and governance, and alignment with corporate strategy • Resources deployed • Comparators – diversity, update, challenge/aspiration • Use – simple > sophisticated • Technology – spreadsheet > integration in BI • Data sources • Implementation reflects institutional requirements as well as capability – level 3 not attainable, or even desirable, for some

  27. International benchmarking • European project http://www.education-benchmarking.org/ • Dedicated project event – presentations on the HESA website • PA Consulting report ‘International benchmarking in UK HE’ (October 2011) – on the HESA website • Includes reviews of: • ACU University Management Benchmarking Programme • EU projects: Indicators for Mapping and Profiling Internationalisation (IMPI), U-Map/U-Multirank, EUMIDA feasibility study for creating a European University data collection • International resources online at the end of the PA report

  28. International resources – in PA report/to be part of new data sources • Data sources for: • Whole institution comparisons and rankings • Cross-country comparisons of institutional performance in specific areas • Narrative comparisons of process and/or policy approaches • Information on national market characteristics • Intelligence reports on national market developments • Selected benchmarking resources in America, Canada and Australia • European national data providers • http://benchmarking.hesa.ac.uk/wp-content/uploads/2011/10/HESA_International_Benchmarking_report.pdf

  29. Business drivers for international benchmarking Benchmarking to: • Support universities’ internationalisation strategies and plans • Understand relative strengths, weaknesses, opportunities and threats • Inform decisions about objectives, priorities and targets • Manage international performance through relevant KPIs

  30. Experiences from UK institutions • Widely differing levels of maturity in internationalisation and hence in international benchmarking activity: • Ambivalence over international league tables • Lack of contextual knowledge of, or trust in, data – the ACU programme looks at process in context http://www.acu.ac.uk/member_services/benchmarking_programme/benchmarking_methodology • Only a few trusted sources eg Thomson Reuters research and the International Student Barometer • Information gaps eg employability, teaching quality • Consequent focus on intra-national comparators

  31. International data - caveats • Issues: • Comparability eg level, mode of study • Comprehensiveness • Timeliness eg the US five-yearly collection • Methodologies • ‘HESA is one of relatively few national agencies collecting and publishing timely and reliable national data on institutional performance.’ • ‘In the UK we are accustomed to having recent data about our HE institutions but this is not universally the case.’

  32. Conclusions • International benchmarking is an evolving science and art for UK higher education • Maturity model • Priorities for institutions: • Extent of international perspective > tailoring approaches to business priorities • Identifying relevant, reliable information sources • Collaborative development of better resources and sharing learning

  33. heidi – future enhancements • Re-designed DLHE data • Federated user accounts • An Application Programming Interface (API) for integration with institutional BI systems • Further user interface and performance improvements • Introduction of user preferences, such as selecting a default institution to be highlighted within reports and the ability to select the number of recent reports displayed on the home page • Customisation of the ‘manage reports’ screen • Further enhancements to improve heidi as a benchmarking tool eg statistical benchmarking • http://www.heidi.ac.uk/
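Since the API is listed here as a future enhancement, there is no public interface to quote; the sketch below is a purely hypothetical illustration of how an institutional BI pipeline might pull indicator data once an API ships. The base URL, endpoint, parameters and response shape are all invented.

```python
# Hypothetical sketch only: the heidi API does not yet exist, so the
# endpoint, parameters and response shape below are placeholders.
import requests

BASE = "https://api.heidi.example/v1"  # placeholder URL, not real

def fetch_indicator(indicator, institutions, year, token):
    """Pull one indicator for a set of institutions and a year."""
    resp = requests.get(
        f"{BASE}/indicators/{indicator}",
        params={"institutions": ",".join(institutions), "year": year},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    # assumed shape: a list of {institution, value} records that a BI
    # tool could load directly into its benchmarking dashboards
    return resp.json()
```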

  34. BIS White Paper: ‘Students at the heart of the system’ • Para 6.22 - a new system that: • Meets the needs of a wider group of users • Reduces duplication (‘burden’) • Results in timelier and more relevant data • Redesigning the HE data and information landscape • Report now published after extensive consultation • http://landscape.hesa.ac.uk/

  35. HEBRG survey of data collection

  36. Model for governance and standards in the new landscape

  37. Benchmarking - future • Increasing use: • Assessing the student experience, market opportunities and portfolios • Looking forward in time • Looking even more outside the institution • Knowing the institution and ‘business’ in richer ways • Using data as part of a richer Business Intelligence resource

  38. HESA Benchmarking project http://benchmarking.hesa.ac.uk Graham Fice graham.fice@hesa.ac.uk
