
HESA for Planners




  1. HESA for Planners

  2. Objectives • Identify best practice around quality assurance and use of data • Improve our understanding of check documentation and how it can be utilised • Introduce the downloadable files and how they can be used • Outline the future information landscape • Learn from each other

  3. “You never finish HESA, you abandon it” • Utilisation of time – ‘opportunity cost’ • Efficient and cost-effective procedures • Best practice • Collaborative approach to data • Resource • Systems that work for the organisation

  4. The institutional perspective

  5. HESA Student Return Paul Cotter UWE 18th September 2012

  6. Topics for today… Background The HESA Team Data Checking Process – Past Data Checking Process – Present + Future? Managing HESA “High Season” Headlines + Conclusions

  7. Background • The HESA Student Return was historically administered within a traditional Academic Registry service • Following a series of restructures, HESA responsibility moved to the Business Intelligence and Planning function at UWE, which was later subsumed within a larger “super service” • Major initiatives are ongoing, designed to: • Make best use of data checking opportunities throughout the academic year • Minimise the impact of the workload peak during the HESA “high season” (August to October) • Deliver a high-quality HESA student return without the stress, anxiety and excessive hours traditionally associated with this activity!

  8. Vice Chancellor (Formal Sign-Off) • Head of Planning (Oversight/Scrutiny, Local Sign-Off) • External Reporting Manager (Operational Ownership, HESA Expert) • External Reporting (Planning) Assistants x2 HESA Student Return Team

  9. Data Checking Processes – Past • Go back 5+ years and “early” data checking efforts were undertaken by a single Academic Registry staff member (limited effectiveness, no shared ownership, no shared best practice, and a huge workload “peak” between August and October) • More recently, data quality activity prior to August focused on locally ‘policed’ data checking on the basis of a UWE-wide “data calendar” and associated guidance • The HESA team leader would chair a group of ‘data managers’ from across the university to enforce standard practice in terms of academic infrastructure and student record keeping • This process was inclusive (UWE-wide) and in keeping with a devolved administrative structure, but it could not guarantee conformity and left a significant volume of amendments to be processed after 1 August (ineffective, too devolved… insufficient resources to ‘police’ 30,000 student records, and variation in practice)

  10. Data Checking Processes – Present + Future • Major operational restructure leading to centralisation of administrative activities; efficiency savings required efficient solutions • Ownership of student record quality sits with the centralised Student Records team; data checking activity is maintained and monitored by the HESA team • Standardised, automated data checking reporting systems: • Backed up by a “signed off” data checking process and clear, up-to-date online documentation accessible by all stakeholders • Policy of continuous data quality monitoring to “flatten” workload peaks • Proactively optimise data accuracy by using ‘beta’ systems prior to 1 August • A “service level agreement” required between the Student Records and HESA teams to clarify responsibilities, deadlines, etc.

  11. Managing HESA “High Season” • Document the entire HESA process • Test documentation and processes through internal audit to legitimise the approach • Attend HESA briefings and pay attention to communications to ensure that all system developments for future returns are dealt with as soon as possible • Hit the button on 1 August with the most accurate, conforming raw student record dataset possible • Work through your pre-agreed task list as systematically as possible • Make use of reconciliation tools to scrutinise accuracy and consistency

  12. Headlines + Conclusions • Proactive investment in data checking processes/tools and co-ordination of activity will pay dividends • Viewing HESA as a long-term ‘project’ will deliver short- and long-term benefits: • Increased confidence in underlying student record quality • Minimised “high season” stress • A clear audit trail for future reference

  13. HESA for Planners, 18 September 2012 The HESA Integration Group Dr Nicky Kemp Director of Policy and Planning University of Bath

  14. Role of Policy and Planning • Accountable to Council for risk management process • Accountable to Audit Committee for data verification process • Accountable to Vice-Chancellor for the accuracy of data returns • Accountable to colleagues for guidance and interpretation

  15. HESA Integration Group Who’s who? • Policy & Planning • Finance • Human Resources • Student Records • Estates • Computing Services

  16. HESA Integration Group Responsibilities: • OPP: Institutional Profile, KIS, HEBCIS • Finance: HESA FSR; TRAC • HR: HESA Staff Return • SREO: HESA Student Record (NSS, DLHE); HESES • Estates: Estates Management Statistics

  17. What we do

  18. HESA Integration Group Activities: • Co-ordinate annual returns • Plan future returns • Consider implications for related activity • Undertake horizon scanning

  19. Data quality and use • Consistency • Funding & regulation • Visibility

  20. Verification • Consistent, documented verification process • Provision for internal ‘external’ scrutiny • Timely preparation of data submissions and their scrutiny • Credibility checks of data

  21. Using check documentation

  22. What is check documentation? • Check doc is an Excel workbook which displays the raw data in a series of tables • Key tool for quality assurance used by HESA and the HEI • Produced after a successful commit/test commit One of many tools for QA – should not be used in isolation!

  23. Why use check documentation? • Check documentation gives an overview of the submitted data • Comparison feature also useful for later commits/test commits to monitor changes • Identify and explain/resolve anomalies in the data • Cannot rely solely on HESA to check the data

  24. Understanding check documentation • Use the check documentation guide as a starting point • Use last year’s data for comparison
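The commit-to-commit comparison described in these slides can be sketched in code. The table shape, student counts and the 10% threshold below are all invented for illustration; in practice the tables would be read from the check documentation workbook itself (e.g. with pandas' read_excel) and thresholds set to suit the institution.

```python
import pandas as pd

# Invented figures standing in for one table of the check documentation:
# student counts from last year's return and this year's commit.
prev = pd.DataFrame({"FT": [4200, 300], "PT": [900, 150]},
                    index=["First degree", "Postgraduate research"])
curr = pd.DataFrame({"FT": [4350, 290], "PT": [1150, 148]},
                    index=["First degree", "Postgraduate research"])

# Year-on-year percentage change for every cell of the table.
pct_change = ((curr - prev) / prev * 100).stack()

# Flag movements beyond a 10% threshold as anomalies to explain or resolve.
anomalies = pct_change[pct_change.abs() > 10]
print(anomalies)
```

Here the part-time first degree count has moved by roughly 28%, so it is flagged for investigation, while the smaller movements pass unremarked. Repeating this after each successful commit or test commit mirrors the comparison feature of the check documentation.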

  25. Using check documentation Check doc contains the definitions and populations needed to recreate an item and identify students

  26. Other commit stage reports • HIN report – reports any cases where the year-on-year link has been broken • Exception report – reports any commit stage validation errors and warnings, which should be reviewed by the HEI

  27. Minerva • …is the data query database operated by HESA • During data collection, HESA (and HEFCE) raise queries through Minerva and institutions answer them • These responses are then reviewed and stored for future use by HESA and the institution

  28. Contextual information in Minerva • At the request of the National Planners Group HESA have begun working on a ‘public’ version of Minerva • Designed to give users of the data additional context • In November HESA will publish a query to Minerva to which HEIs can add notes about their institution e.g. ‘we have lots of franchise students’, ‘we recently opened a new department’ • HESA will not interact with what is added • Will remain open throughout the year to add to • HESA will extract and send the information to accompany data requests

  29. Using downloadable files

  30. Downloadable files • Data Supply (Core, subject, cost centre, module and qualifications on entry tables) • NSS inclusion (person and subject) and exclusion files • POPDLHE • TQI/UNISTATS • All available after every successful full and test commit

  31. Using downloadable files • The files should be utilised to: • Carry out additional DQ checks • Support benchmarking • Support planning/forecasting • Improve efficiency (recreating data from scratch is unnecessary) • The files include derived fields and groupings not otherwise found within the data
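As a sketch of the kind of additional DQ check the slide describes: the field names and records below are invented for illustration (the real Data Supply file carries HESA's own field and derived-field names), but the pattern of cross-tabulating one field against another to surface implausible combinations carries over directly.

```python
import pandas as pd

# Invented stand-in for a few columns of the Data Supply core table.
supply = pd.DataFrame({
    "student_id": ["1111", "2222", "3333", "4444", "5555"],
    "mode":       ["FT", "FT", "PT", "FT", "PT"],
    "domicile":   ["UK", "UK", "Other EU", "Non-EU", "UK"],
})

# A quick DQ check: cross-tabulate mode of study against domicile so that
# unexpected combinations stand out, and so counts can be benchmarked
# against internal planning figures without recreating data from scratch.
check = pd.crosstab(supply["mode"], supply["domicile"])
print(check)
```

The same approach extends to any pair of fields, including the derived fields and groupings that only appear in the downloadable files.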

  32. League tables • Student staff ratios by institution and cost centre • First degree (full-time for Guardian) qualifiers by institution and league table subject group • Average total tariff scores on entry for first year, first degree students by institution and league table subject group. • Data is restricted to tariffable qualifications on entry (QUALENT3 = P41, P42, P46, P47, P50, P51, P53, P62, P63, P64, P65, P68, P80, P91) (Times applies ‘under 21’ restriction, Guardian applies ‘full-time’ restriction) • Full-time, first degree, UK domiciled leavers by Institution, League table subject, Activity • Full-time, first degree, UK domiciled leavers entering employment • Graduate employment/Non graduate employment/Unknown • Positive destinations /Negative destinations • Expenditure on academic departments (Guardian) • Expenditure on academic services (Guardian, Times, CUG) • Expenditure on staff and student facilities (Times, CUG)
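The tariff restriction listed above can be shown with a short sketch. The entrant records, scores and the non-tariffable code "X00" are invented; only the QUALENT3 code list is taken from the slide.

```python
# QUALENT3 codes treated as tariffable qualifications on entry (per the slide).
TARIFFABLE = {"P41", "P42", "P46", "P47", "P50", "P51", "P53", "P62",
              "P63", "P64", "P65", "P68", "P80", "P91"}

# (QUALENT3 code, total tariff score) for some invented entrants.
entrants = [("P41", 360), ("P50", 280), ("P80", 320), ("X00", 0)]

# Restrict to the tariffable population before averaging, as the league
# table compilers do; the "X00" entrant is excluded, not counted as zero.
scores = [score for code, score in entrants if code in TARIFFABLE]
avg_tariff = sum(scores) / len(scores)
print(avg_tariff)  # 320.0 — averaged over tariffable entrants only
```

The point of the restriction is visible in the sketch: including non-tariffable entrants as zeros would drag the average down, so they are filtered out first.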

  33. There is no smoking gun “Firstly, you need a team with the skills and motivation to succeed. Secondly, you need to understand what you want to achieve. Thirdly, you need to understand where you are now. Then, understand ‘aggregation of marginal gains’.  Put simply….how small improvements in a number of different aspects of what we do can have a huge impact to the overall performance of the team.” Dave Brailsford, Performance Director of British Cycling


  35. Demonstration….

  36. HESA reporting and the new information landscape: the road ahead

  37. (Interim) Regulatory Partnership Group • Established by HEFCE and SLC • Includes: HESA, QAA, OFFA, OIA • Observers: UUK, GuildHE, NUS, UCAS • To advise on and oversee the transition to the new regulatory and funding systems for higher education in England • Projects A and B

  38. D-BIS White Paper • Para 6.22 of White Paper • A new system that: • Meets the needs of a wider group of users • Reduces duplication • Results in timelier and more relevant data • Also work with other government departments • Secure buy-in to reducing the burden

  39. Project B • Re-design the HE data and information landscape • Feasibility and impact analysis • Roadmap for future development • Focus on data about courses, students and graduates • Report to RPG June 2012

  40. Engagement • HEFCE, UCAS, SLC, AHUA, BUFDG, SFC, HEFCW, WG, SROC, UCISA, ARC, Scot-ARC, ISB, OFFA, SAAS, HEBRG, CHRE, QAA, SPA, JISC, UKBA, ISB-NHS, HO, BIS, IA, DS, LRS, AOC, UUK, GuildHE, NPG, SLTN, DH • Workshops, webinars, web site and newsletters

  41. HEBRG survey of data collection

  42. HEBRG survey of data collection • Variety of responses from institutions • Some references to Data Supply Registers.. • …but comparison with other responses suggests these registers are incomplete • Some further research in this area, especially around PSRBs… • …still much to do to get a complete picture at a sector level

  43. What is our aim? Create a sector-wide information database by… attempting to fit a single, rigid data model onto an HE sector that is diverse and dynamic

  44. Data model

  45. It is all about translation • From an HEI’s internal data language to an external data language • The extent to which these match • The variety of external languages that an HEI has to work with • A different journey for each institution • Issues of lexicon as well as data definitions…

  46. HE lifecycle

  47. Collection processes (eg HESA) • Annual, retrospective, detailed collection • Core dataset for funding, policy and public information • Demanding quality standards • Submission ratio = submitted records ÷ final records
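A worked example of the submission ratio defined above (records submitted in a commit divided by the final record count), with invented counts:

```python
# Invented figures: records submitted at an early commit versus the
# final record count for the return.
submitted_records = 29_100
final_records = 30_000

submission_ratio = submitted_records / final_records
print(f"{submission_ratio:.1%}")  # 97.0%
```

Tracking this ratio across commits shows how close each submission is to the final, signed-off return.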

  48. Student Record submission ratio
