Extreme measures

Presentation Transcript


  1. Extreme measures Professor Sandra Harding James Cook University Acknowledgements: Dr Nick Szorenyi-Reischl, Floris van der Leest and Jasper Taylor, JCU

  2. People in labour force with University qualifications 2006

  3. Good reasons to measure • Public accountability • Providing information to marketplace • Driving particular outcomes: • Employability • Student satisfaction • Learning outcomes • Retention • Engagement • Equity and Indigenous education • Generic skills • Research output: quality, quantity, impact, translation • Cross national comparison and competitiveness • Improvement: AUQA, group, sector level…

  4. Renewed interest in measures… • Australia • Reviews – Bradley, Cutler, CRC, RQF → ERA • Compacts • AUQA • Queries about international standing • Abroad • Bologna process – comparable, transportable qualifications • OECD: measuring teaching cross-nationally • US: concerns about competitiveness • UK: improving measures, including for upskilling • League tables and their impacts

  5. Getting the measure…..

  6. Institutional level • DEEWR reporting – IAF institutional level • Organisational sustainability • Quality of outcomes • Compliance with legislation • Monitoring equity • Indigenous educational outcomes • Student evaluation of teaching • Student evaluation of subjects • Sundry other surveys

  7. Sector level – data gathered at level of the institution Teaching and learning: • Graduate Destination Survey • Course comment • Labour market status • Employment details • Current further study • Course Experience Questionnaire • Good teaching • Clear goals and standards • Appropriate workload • Appropriate assessment • Generic skills

  8. Sector level….. • AUSSE: Australasian Survey of Student Engagement (25 universities in 2007), administered by ACER • First Year Experience Questionnaire (since 1994) • Demographic data • First-year student experience • Universities Australia • Statistical data collection from members

  9. Sector level… • Various other benchmarking exercises, within and across groups of institutions • Research measures…. Higher Education Research Data Collection (HERDC) • Research income – Australian Competitive Grants (and others) • Publications (quality outlets, refereed) • PhD completions • Student load

  10. Excellence in Research for Australia (ERA) • 141 Fields of Research groupings (4-digit ANZSRC FoR codes; 8 clusters) • Three broad categories of indicators: • Measures of activity and intensity (total research income and total number of outputs) • Indicators of research quality (publications over 6 years; competitive research income over 3 years; other discipline-specific measures) • Indicators of excellent applied research and translation of research outcomes (no detail as yet)
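
To make the indicator structure concrete, the sketch below (in Python) aggregates activity-and-intensity measures per 4-digit FoR code. The records, figures and field names are invented for illustration; this is not the ERA methodology itself.

    from collections import defaultdict

    # Toy records: (4-digit FoR code, research income, publication count).
    # All figures here are invented for illustration.
    outputs = [
        ("0801", 1_200_000, 45),
        ("0801",   300_000, 12),
        ("1117", 2_500_000, 60),
    ]

    totals = defaultdict(lambda: {"income": 0, "pubs": 0})
    for for_code, income, pubs in outputs:
        totals[for_code]["income"] += income  # activity: total research income
        totals[for_code]["pubs"] += pubs      # intensity: total output count

    for code, t in sorted(totals.items()):
        print(code, t["income"], t["pubs"])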

  11. Internationally… • OECD data: spending on higher education • World rankings of universities – league tables: • Shanghai Jiao Tong Academic Ranking of World Universities • Times Higher Education Supplement (THES) – QS World University Rankings

  12. Fragility • Data used for purposes other than originally intended • CEQ and GDS used as proxies for quality and to distribute funding: • Learning and Teaching Performance Fund • Good Universities Guide • Research data used to distribute block funding (IGS, RIDC, RTS, APAs, CTSs) • Low response rates • Reliability and validity problems • Data lagged • Disconnect between institutionally gathered data on satisfaction and CEQ outcomes
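
The response-rate concern lends itself to a worked example. A minimal sketch, assuming a hypothetical cohort size and satisfaction level, of how the margin of error on a survey proportion widens as responses fall; note that it captures sampling error only, not the nonresponse bias that is the deeper validity worry.

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """95% margin of error for a proportion p estimated from n respondents."""
        return z * math.sqrt(p * (1 - p) / n)

    cohort = 2000                       # hypothetical graduating cohort
    for response_rate in (0.60, 0.30, 0.15):
        n = int(cohort * response_rate)
        moe = margin_of_error(0.75, n)  # assume 75% report satisfaction
        print(f"response rate {response_rate:.0%}: n={n}, margin ±{moe:.1%}")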

  13. Hot topics • LTPF (Scanlon, James): • Allegations of institutional bad behaviour • Reporting on whole of institution, not field of study • Can students report reliably? • Large impact of small differences between institutions • Dubious measure of institutional performance • Self-reported generic skills a poor indicator • CEQ tells little about learning in absolute or value-add terms

  14. Hot topics … • SJTU and THES league tables (Marginson) • Both have problems • The THES ranking particularly – volatile; opinion-based; not reproducible • Shanghai Jiao Tong: underplays the humanities and social sciences and publishing outside English-language journals; institutional history weighs heavily
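
The volatility point can be illustrated with a small simulation: hold "true" institutional quality fixed, add the kind of noise that opinion-based surveys introduce, and ranks reshuffle between years. All scores below are synthetic assumptions.

    import random

    random.seed(1)
    quality = {f"Uni{i}": 70 + i for i in range(10)}  # fixed "true" quality

    def noisy_rank(noise_sd: float) -> list:
        """Rank institutions on quality plus opinion noise."""
        scores = {u: q + random.gauss(0, noise_sd) for u, q in quality.items()}
        return sorted(scores, key=scores.get, reverse=True)

    year1, year2 = noisy_rank(3.0), noisy_rank(3.0)
    moves = sum(abs(year1.index(u) - year2.index(u)) for u in quality)
    print("rank positions moved between years:", moves)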

  15. Hot topics … • Measurement driving loss of diversity? • Risk of losing interdisciplinary research: a challenge for ERA • International developments – Bologna impacts • Measuring outcomes for student learning and value add: • Assessment tasks: uncalibrated, unknown reliability and validity • Standards elusive – need to focus on value-add outcomes to deal with sector diversity, with measurement based on outcomes • Measuring international students’ English literacy – IELTS predictive validity unknown
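
On value add: a minimal sketch of what an outcomes-based value-added measure might look like if calibrated entry and exit scores existed (which, as the slide notes, the sector largely lacks). The scores are invented; value add is read as the residual from a simple regression of exit on entry.

    # Hypothetical entry and exit scores per student (invented data).
    entry = [55, 62, 70, 48, 80, 66]
    exit_ = [60, 70, 74, 58, 82, 75]

    n = len(entry)
    mx, my = sum(entry) / n, sum(exit_) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(entry, exit_))
            / sum((x - mx) ** 2 for x in entry))
    alpha = my - beta * mx

    # Value add = how far each student exceeds the exit score
    # predicted from their entry score.
    for x, y in zip(entry, exit_):
        print(f"entry={x}, exit={y}, value add={y - (alpha + beta * x):+.1f}")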

  16. What do we need? • Better, targeted measures – perhaps fewer? • Measures being used appropriately • Measures that drive the right behaviour What don’t we have? Good measures of: • Impact of research • Standard of English • Standard of learning • Diversity • Interdisciplinarity • The value added through education • Industry outcomes / alumni outcomes
