HESA for Planners

Presentation Transcript


  1. HESA for Planners

  2. Objectives • Identify best practice around quality assurance and use of data • Improve our understanding of check documentation and how it can be utilised • Introduce the downloadable files and how they can be used • Outline the future information landscape • Better understand the IRIS outputs • Learn from each other

  3. “You never finish HESA, you abandon it” • Utilisation of time – ‘opportunity cost’ • Efficient and cost-effective procedures • Best practice • Collaborative approach to data • Resource • Systems that work for the organisation

  4. How to be good… • Data ownership: • Systems (storage issues) • People • Translation: • From an HEI’s internal data language to an external data language • The extent to which these match • The variety of external languages that an HEI has to work with • Documentation • How, who, when • Education • Value of data and transparency

  5. Evidence (or anecdote) from the KIS • New requirement – high profile • Data spread across institutions • No documentation • Little/no control • No standardisation/comparability • Variable quality • Variable approaches to storage • …being assembled and managed in spreadsheets

  6. Spreadsheets • Often created by people who don’t understand principles of sound data management • Conflate data and algorithms • Almost impossible to QA • Spread and mutate like a virus Search “Ray Panko spreadsheets”

  7. The institutional perspective

  8. A Planning Perspective on HESA Returns Fidelma Hannah, Director of Planning, Loughborough University

  9. Overview Responsibility for completing HESA returns lies with relevant sections of the University but Planning has the role of: • co-ordinating the returns • ensuring appropriate governance, data assurance and consistency between all HESA returns • disseminating HESA data across the University

  10. Co-ordination • The Planning Office produces a schedule of Statutory Returns listing all HESA, HEFCE and other Funding Agency returns, identifying: • Submission dates • Ways in which data is used • Process for completion • Independent checking and sign-off process • The Planning & Finance Offices are accountable to the Vice-Chancellor and Audit Committee for the verification and accuracy of the data returns. • The Planning Office liaises with all relevant sections of the University to ensure that returns are completed, checked and signed off in accordance with the schedule.

  11. Co-ordination, contd. Planning is: • Involved most directly with preparation and checking of HESA student return BUT • Has a significant and increasing input into the processes used for other HESA returns

  12. Responsibility for Completion of HESA Returns • Student Return – Student Office, Academic Registry • Staff Return – Human Resources • Finance Return – Finance Office • HEBCI – Enterprise Office/Planning Office • Destination of Leavers - Careers • Estates Management Statistics – Facilities Management • Institutional Return – Planning Office

  13. Other Student-Related Returns • HESES • TRAC • OFFA • Teaching Agency • Skills Funding Agency • Education Funding Agency • REF. All of these returns incorporate HESA data

  14. Governance and the role of Audit Committee • The University’s Audit Committee must provide assurance about the management and quality assurance of data provided to HEFCE, the Higher Education Statistics Agency (HESA) and other public bodies. • This is a requirement of the HEFCE Financial Memorandum and Accountability and Audit Code of Practice introduced on 1 August 2008. • Audit Committee reviews the schedule of statutory returns annually and also receives regular reports from internal and HEFCE auditors on the various returns.

  15. Data Assurance – in year Planning: • Liaises closely with Student Office during preparation of HESES return as this helps to ensure data quality in year • Co-ordinates monthly Data Management Group meetings • Membership: IT Services, Planning, Student Office, Research Student Office, Careers and Admissions • Reviews the funding and monitoring data produced by HEFCE after HESA return has been submitted

  16. Data Assurance during HESA preparation Planning: • Maintains regular contact with Student Office during preparation of HESA return • Uses the HEFCE recreation files extensively to check data quality before HESA student return is finally committed (This includes detailed examination of individualised student files) • Undertakes a comprehensive review of check documentation at commit stage with cross-checking by Finance Office • Retains comprehensive records and an audit trail of the checking processes • Joins the briefing meeting with VC before sign-off

  17. Consistency across HESA Returns • Vital to ensure that data is consistent across HESA Staff, Student and Finance returns because data will be combined • Important to align JACS, Cost Centres and UOAs • Implications for subject mapping must be considered • Implications for funding must be considered, e.g. JACS codes and cost centres are both used to determine additional funding for very high cost subjects • Added complexity of Key Information Sets

  18. Disseminating Benchmarking Data and Comparisons • Use of HEIDI to generate benchmarking data at subject level, including: • Student: Staff Ratios • NSS • Employability • Degree Classifications • International/UK/EU students • Completion rates • Production of institutional profile data such as: • Student profile • Income & Expenditure profile • Cost Centre profile

  19. Final Comments • Understanding HESA data is becoming even more critical in the current HE environment • Ensuring the accuracy of HESA data is important for future funding streams (SNC monitoring, additional funding for high-cost subjects, WP indicators, etc.) • Effort invested will make future income streams more reliable, avoiding claw-back in later years. BUT • Complexity and cross-checking are increasing demands on Universities.

  20. HESA – Living and Learning Becs Lambert, Senior Assistant Registrar, Strategic Planning and Analytics, University of Warwick

  21. Outline • The Warwick context • Warwick’s HESA process • Sign-off, Verification and Quality Assurance • Issues • Positives • Challenges moving forward • Using HESA data – the HEIDI API

  22. My context… [image slide] …baptism of fire

  23. Warwick context…

  24. The HESA process • Warwick SITS update schedule (positives and negatives) • Prep and housekeeping from April – address learning points from previous year, implement procedures for HESA changes, check ‘usual suspects’ • Strong use of validation kit to identify issues • Use of internal Access databases to cross-check HESA return data and ensure comprehensive data checks • Aim for as early a submit/commit schedule as possible to front-load schema and business validation issues

  25. Sign-off, verification and data quality

  26. Issues

  27. Strengths

  28. Challenges moving forward

  29. Using HESA data – the HEIDI API • Before… • Flexible report writing with drag and drop interface for usability, but can be slow to build large reports • Direct output to Excel or XML file • Limited to 125 columns for extracts (e.g. Finance Table 5b has 490 columns of data times 3 years = 1470 columns = 12 separate extract files) • People like cross-tabular reports, but data warehouses need flat data files, so the extracts need to be transformed prior to loading (see the sketch after this slide) • Our data transformation was based on a VBA script in Excel, but needed to be customised for each extract (different column numbers and groupings) • Depending on the extract size, many files may need to be processed and concatenated together
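The flattening described in the slide above was done at Warwick with a VBA script in Excel. Purely as an illustration of the same idea, and not of Warwick's script, the sketch below uses Python and pandas to melt a hypothetical cross-tabular extract (the file name and the "Institution" column are assumptions) into the flat, long-format shape a warehouse expects.

```python
# Minimal sketch (not Warwick's VBA): flatten a cross-tabular HEIDI extract
# into a long-format file for warehouse loading. File and column names are
# illustrative assumptions.
import pandas as pd

# A cross-tab extract: one row per institution, one column per year.
extract = pd.read_excel("heidi_extract.xlsx")

# Melt the year columns into (Institution, Year, Value) rows.
flat = extract.melt(id_vars=["Institution"],
                    var_name="Year",
                    value_name="Value")

flat.to_csv("heidi_extract_flat.csv", index=False)
```

Because each extract has a different column layout, a script like this still has to be adjusted per extract, which is the pain point the API route on the following slides addresses.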

  30. HEIDI > Data Warehouse • Turning this (a cross-tabular HEIDI extract) into this (a flat warehouse file) is relatively slow and painful!

  31. HEIDI API > Data warehouse • API permits rapid extraction of large volumes of data in warehouse-friendly format • Based on standard web services technology • Difficult to use and requires specialist technical skills, but very powerful and fast • Generate a custom URL to produce a response (e.g. https://heidi.hesa.ac.uk/api/1.0/datareport?rowtype=3297&year=61422&domain=3311&valuetype=4008&field=61432 produces a report of UCAS Accepted Applicants for 2011/12 by Institution and Gender; a sketch of such a call follows after this slide) • Extracts produced as single files containing data and metadata (field descriptions) • Simple direct loading into warehouse • XML shredded (transformed) into data tables using the native query language capabilities
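As a minimal sketch only, the snippet below calls the example URL from the slide above and walks whatever XML comes back. It assumes nothing about the response schema and leaves out authentication, which the real service will require; the point is simply that a single request returns data plus metadata ready for shredding.

```python
# Minimal sketch: fetch a HEIDI API data report and inspect the XML response.
# Authentication/credentials are omitted (an assumption to fill in), and no
# particular response schema is assumed -- the loop just prints every element
# so the data and metadata can be seen before shredding into warehouse tables.
import requests
import xml.etree.ElementTree as ET

URL = ("https://heidi.hesa.ac.uk/api/1.0/datareport"
       "?rowtype=3297&year=61422&domain=3311&valuetype=4008&field=61432")

response = requests.get(URL)
response.raise_for_status()

root = ET.fromstring(response.content)
for element in root.iter():
    if element.text and element.text.strip():
        print(element.tag, element.text.strip())
```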

  32. HEIDI API > Data warehouse • Lessons Learnt: • API is not a “magic bullet” but is a useful additional tool • Harvesting HESA data for BI analysis is now down from days to hours, but specialist skills and knowledge are still required • Current API needs simplifying and extending to allow multi-year and multi-value extracts • Next steps for Warwick: • Use of the API still requires a number of steps – the plan is to integrate the extraction and loading of data more tightly using SQL Server Integration Services • Provision of a standardised self-service reporting capability for power users to extract and analyse HESA data contained in the warehouse

  33. Discussion • Discuss the following as a table: • How good are you at HESA? • Consider factors such as data ownership, documentation, staffing, knowledge, resilience, training, systems and the data quality process – how extensive and sophisticated is it? How often do you use HESA data, what for, and how are the process and data managed and structured internally? What are the barriers and how do you overcome them? • Now consider and rate your own institution: 1st class 2:1 2:2 3rd Unclassified

  34. Using check documentation

  35. What is check documentation? • An Excel workbook which displays the data in a series of tables • Used by analysts at HESA for quality assurance • Available after any successful test or full commit

  36. Why should I use it? • Check documentation gives an overview of the submitted data which can help identify potential issues • Provides context to the queries raised by HESA • The institution will be able to spot anomalies that HESA would not • Comparison feature also useful for later commits/test commits to monitor changes Check doc is one of many reports and is best used in conjunction with other reports

  37. Task • In your groups, or individually, complete check documentation tasks 1-4

  38. How can check doc be used? • Use the check documentation guide produced by HESA as a starting point • Many of the items provide year-on-year comparisons

  39. Using check documentation • Different populations and groupings are used for each item in the check documentation, including derived fields • For 2012/13 the definitions sheet has moved to the coding manual

  40. Who are those 5 students?? • To get the most out of check documentation and work out whether something is an error, you need to identify the records behind the table • To do this you can use Data Supply which contains much of the raw data submitted alongside the derived fields used by HESA • Pivot tables can be used to recreate items and identify particular cells
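As a minimal sketch of that workflow (the file name and the field names LEVEL, MODE and HUSID are used illustratively here, not as the exact Data Supply headings), a pivot table recreates a check documentation style count and a filter then lists the individual records behind one surprising cell.

```python
# Minimal sketch: recreate a check documentation item from the Data Supply
# file with a pivot table, then drill into one cell to find the records
# behind it. Field names and example values are illustrative assumptions.
import pandas as pd

data_supply = pd.read_csv("data_supply.csv")

# Count student instances by two grouping fields, as a check doc item would.
item = pd.pivot_table(data_supply,
                      index="LEVEL",
                      columns="MODE",
                      values="HUSID",
                      aggfunc="count",
                      fill_value=0)
print(item)

# Who are those 5 students? Filter back to the rows behind one cell.
records = data_supply[(data_supply["LEVEL"] == "Postgraduate taught") &
                      (data_supply["MODE"] == "Part-time")]
print(records["HUSID"].tolist())
```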

  41. Identifying students: • The HESA for Planners manual contains instructions on recreating the populations and conditions used in check documentation • As an example we will recreate item 6a ‘Student cohort analysis’….

  42. Check doc changes for 2012/13 • Revised tolerances • Items 1, 2 & 3 will now highlight year on year changes of +/- 10%/50 students • Item 11 will look at sector averages rather than just the previous year • Move to JACS3 and new cost centre coding frame • New Fees tab • More detailed breakdowns, summations and percentage changes added to enable checking
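To make the revised item 1-3 tolerance concrete, here is a small worked sketch. It assumes that a change is highlighted only when it breaches both the 10% and the 50-student thresholds; that combination is an assumption, so check HESA's check documentation guide for the exact rule.

```python
# Minimal sketch of the items 1-3 year-on-year tolerance. Assumption: a
# change is flagged only if it exceeds both 10% and 50 students -- confirm
# the exact rule against HESA's definition.
def breaches_tolerance(previous_year: int, current_year: int) -> bool:
    change = abs(current_year - previous_year)
    pct_change = (change / previous_year * 100) if previous_year else float("inf")
    return change >= 50 and pct_change >= 10

# Example: 620 students last year, 680 this year is +60 students (+9.7%),
# so only the 50-student test is met and nothing is flagged.
print(breaches_tolerance(620, 680))  # False under this assumption
```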

  43. Item 2a - Qualifications awarded What is the difference between 2 and 2a?

  44. Item 7 – Highest qualifications on entry • Now split into 7a & 7b ‘proportion of highest qualification on entry for first years’ • Subtotals also added to item 7a

  45. Item 12 – average instance FTE • This item has been broken down further to provide a three way split of starters, leavers and ‘others’. • The different groups may have very different FTE values that impact the average

  46. Other reports

  47. Minerva …is the data query database operated by HESA • During data collection HESA (and HEFCE) raise queries through Minerva and institutions answer them • These responses are then reviewed and stored for future use by HESA and the institution
