
Extracting useful information from the UK’s National Student (Satisfaction) Survey


Presentation Transcript


  1. Extracting useful information from the UK’s National Student (Satisfaction) Survey. Mark Langan, Alan Fielding and Peter Dunleavy, Manchester Metropolitan University, Faculty of Science & Engineering. Higher Education Academy Annual Conference 2010.

  2. What makes a student satisfied?

  3. Structure: • Research evidence based on a range of quantitative approaches using the national dataset for science subjects. • Discussion of the implications of using the NSS for decision-making in H.E.

  4. A compulsory process in the UK (since 2005/06), conducted by Ipsos MORI on behalf of HEFCE. Uses a standard basic survey to monitor the perceptions of final-year students. The approach is based upon an Australian survey, the Course Experience Questionnaire (CEQ; Ramsden 1991). Considered robust in terms of three statistical measures: internal consistency, construct validity and concurrent validity. Measures six dimensions: teaching; assessment and feedback (sometimes considered separately); academic support; organisation and management; resources; and personal development.

  5. The NSS uses a 5-point scale and is completed (usually online) in the final undergraduate year. In addition to the 21 ‘items’ there is a separate overall satisfaction rating (Q22). A thorough overview can be found in Surridge (2007) and Marsh and Cheng (2008). Take-home message: the outputs are hierarchical in nature and not designed for simplistic league tables. Note: satisfaction is a complex concept to measure and there are many approaches.

  6. With particular reference to… • consistency of patterns between years • differences between subjects • factors associated with overall student satisfaction.

  7. Data: Level 3 (closest to Programme/Department) NSS data from 2007, 2008 and 2009. Pruned to remove subjects not taught at MMU (e.g. medicine). Still very large data sets (>40,000 cases per survey).
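
The slides do not show how these yearly extracts were assembled, so the following Python sketch is only an illustration of the pooling and pruning step; the file names (nss_level3_<year>.csv), the subject column and the exclusion list are all hypothetical.

```python
import pandas as pd

# Hypothetical file and column names -- the real NSS release format is not described in the slides.
frames = []
for year in (2007, 2008, 2009):
    df = pd.read_csv(f"nss_level3_{year}.csv")  # one Level 3 extract per survey year
    df["year"] = year
    frames.append(df)
nss = pd.concat(frames, ignore_index=True)

# Prune subjects not taught at MMU (e.g. medicine); this exclusion list is illustrative only.
not_at_mmu = {"Medicine", "Dentistry", "Veterinary Science"}
nss = nss[~nss["subject"].isin(not_at_mmu)]

print(nss.groupby("year").size())  # still very large: >40,000 cases per survey year
```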

  8. NSS Questions
  Teaching (Teach)
  Q1 Staff are good at explaining things.
  Q2 Staff have made the subject interesting.
  Q3 Staff are enthusiastic about what they are teaching.
  Q4 The course is intellectually stimulating.
  Assessment fairness (Fairness)
  Q5 The criteria used in marking have been clear in advance.
  Q6 Assessment arrangements and marking have been fair.
  Assessment feedback (Feedback)
  Q7 Feedback on my work has been prompt.
  Q8 I have received detailed comments on my work.
  Q9 Feedback has helped me clarify things I did not understand.

  9. NSS Questions
  Support
  Q10 I have received sufficient advice and support with my studies.
  Q11 I have been able to contact staff when I needed to.
  Q12 Good advice was available when I needed to make study choices.
  Organisation (Org)
  Q13 The timetable works efficiently as far as my activities are concerned.
  Q14 Any changes in the course or teaching have been communicated effectively.
  Q15 The course is well organised and is running smoothly.

  10. NSS Questions
  Learning Resources (Resources)
  Q16 The library resources and services are good enough for my needs.
  Q17 I have been able to access general IT resources when I needed to.
  Q18 I have been able to access specialised equipment, facilities or rooms when I needed to.
  Personal Development (PD)
  Q19 The course has helped me present myself with confidence.
  Q20 My communication skills have improved.
  Q21 As a result of the course, I feel confident in tackling unfamiliar problems.
  Overall satisfaction (Overall)
  Q22 Overall, I am satisfied with the quality of the course.

  11. Which of the areas surveyed do you think correlate with the Q22 overall satisfaction score? • What do you think student perceptions of the questions are (i.e. what is going through their mind when they complete the questionnaire)?

  12. Satisfaction. Satisfaction is the % of students answering 4 or 5 to a question, e.g. for Q1, Biology at MMU: 95% of students were satisfied.
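
As a minimal sketch of this metric (the respondent answers and column names below are invented for illustration), satisfaction for an item is simply the percentage of 4s and 5s on the 1–5 scale:

```python
import pandas as pd

# Invented respondent-level answers on the 1-5 scale; one row per student.
responses = pd.DataFrame({
    "Q1":  [5, 4, 4, 3, 5, 2, 4, 5, 4, 5],
    "Q22": [4, 4, 5, 3, 5, 3, 4, 5, 4, 4],
})

def pct_satisfied(scores: pd.Series) -> float:
    """Satisfaction = % of students answering 4 or 5 to the question."""
    return 100.0 * (scores >= 4).mean()

print(f"Q1 satisfaction: {pct_satisfied(responses['Q1']):.0f}%")  # 80% for this toy sample
```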

  13. There are subject differences. Subject differences confound simple comparisons; examples from 2009: medians for Qs 7, 8 & 9 plus 13, 14 & 15.

  14. Biology results (2008)

  15. [Figure slide]

  16. What answers are correlated with Q22? Approach: • Use % in agreement with a question (answers 4 & 5 on the 5-point scale). • Simple correlation (ignoring subject). • Correlation allowing for subject differences (ANCOVA). • Repeat for each year. • Calculate nationally and within MMU.
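
A hedged sketch of the two correlation steps, on simulated subject-level percentages (the Q15/Q22 relationship, the subject labels and all numbers are invented): first a simple Pearson correlation ignoring subject, then an OLS model with subject entered as a factor, standing in for the ANCOVA described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Simulated aggregated data: one row per subject/institution case, % agreement per item.
n = 200
subject = rng.choice(["Biology", "Chemistry", "Physics", "Maths"], size=n)
q15 = rng.uniform(50, 100, size=n)               # % agreeing with Q15
q22 = 25 + 0.6 * q15 + rng.normal(0, 8, size=n)  # toy overall-satisfaction relationship
df = pd.DataFrame({"subject": subject, "Q15": q15, "Q22": q22})

# 1) Simple correlation, ignoring subject.
r, p = pearsonr(df["Q15"], df["Q22"])
print(f"Pearson r = {r:.2f} (p = {p:.3g})")

# 2) Correlation allowing for subject differences: Q15 slope adjusted for a subject factor.
model = smf.ols("Q22 ~ Q15 + C(subject)", data=df).fit()
print(model.summary().tables[1])
```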

  17. Annual national trends. Overall satisfaction is consistently related to Teaching Quality, Support and Organisation. It is only weakly related to Resources and Assessment, particularly feedback.

  18. Subject differences (Feedback Qs)

  19. Predictive model (Forest Tree Analysis). • Cluster analyses are unsupervised methods that take no account of pre-assigned class labels or values. • Decision and regression trees use a supervised learning algorithm, which must be provided with a training set containing cases with class labels or values. • We used a new variant of regression trees called ‘Random Forests’: a robust method with fewer constraints than traditional regression methods, for example allowing different factors to be explored in their influence on overall satisfaction within different subgroups.
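
The authors’ model itself is not reproduced in the slides, so this is only a sketch of the general technique using scikit-learn’s RandomForestRegressor on simulated Q1–Q21 percentages; the toy target is weighted towards Q15, Q4 and Q1 merely to echo the predictor ranking reported later, not because that is the fitted model.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Simulated cases: % agreement with Q1-Q21, plus an invented Q22 target.
n = 500
X = pd.DataFrame(rng.uniform(40, 100, size=(n, 21)),
                 columns=[f"Q{i}" for i in range(1, 22)])
y = 0.4 * X["Q15"] + 0.3 * X["Q4"] + 0.2 * X["Q1"] + rng.normal(0, 5, size=n)

forest = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=1)
forest.fit(X, y)
print(f"Out-of-bag R^2: {forest.oob_score_:.2f}")

# Rank the questions by importance (cf. the 'top five predictors' slide near the end).
importance = pd.Series(forest.feature_importances_, index=X.columns)
print(importance.sort_values(ascending=False).head(5))
```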

  20. Regression Trees (an example), based on http://www.dtreg.com/classregress.htm. Predicts property value.
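
In the same spirit as that example, a single regression tree can be grown on a toy property-value table (all values below are invented): each split partitions the cases on one predictor, and each leaf predicts the mean value of the cases that reach it.

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Invented property data: floor area (m^2), bedrooms and value (in thousands).
homes = pd.DataFrame({
    "floor_area": [55, 70, 85, 100, 120, 150, 180, 210],
    "bedrooms":   [1, 2, 2, 3, 3, 4, 4, 5],
    "value_k":    [110, 140, 165, 200, 230, 290, 340, 400],
})

tree = DecisionTreeRegressor(max_depth=2, random_state=0)
tree.fit(homes[["floor_area", "bedrooms"]], homes["value_k"])

# Show the splits; leaf values are mean property values for the cases in that leaf.
print(export_text(tree, feature_names=["floor_area", "bedrooms"]))
```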

  21. Effectiveness of Q1-21 to predict overall satisfaction (Q22)

  22. Predictive model (Forest Tree Analysis)

  23. Q22 ‘under-performers’. [Table columns: Subjects, Actual, Predicted, Residual, SE1, SE2, SE3]

  24. Q22 ‘as expected from Q1-Q21’

  25. Q22 ‘over-performers’
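
Slides 23–25 compare each subject’s actual Q22 with the Q22 expected from Q1–Q21. The sketch below is only an illustration of that residual logic on simulated data: out-of-bag forest predictions stand in for the ‘expected’ scores, and subjects are ranked by mean residual to flag apparent under- and over-performers (the SE columns on slide 23 are not reproduced).

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

# Simulated cases: Q1-Q21 % agreement, an invented actual Q22, and subject labels.
n = 300
X = pd.DataFrame(rng.uniform(40, 100, size=(n, 21)),
                 columns=[f"Q{i}" for i in range(1, 22)])
subjects = rng.choice(["Biology", "Chemistry", "Physics", "Maths", "Computing"], size=n)
actual_q22 = 15 + 0.5 * X["Q15"] + 0.3 * X["Q1"] + rng.normal(0, 6, size=n)

forest = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=2)
forest.fit(X, actual_q22)

# Residual = actual Q22 minus the Q22 expected from Q1-Q21 (out-of-bag predictions).
results = pd.DataFrame({
    "subject": subjects,
    "actual": actual_q22,
    "predicted": forest.oob_prediction_,
})
results["residual"] = results["actual"] - results["predicted"]

by_subject = results.groupby("subject")["residual"].mean().sort_values()
print("Largest 'under-performers':\n", by_subject.head(2))
print("Largest 'over-performers':\n", by_subject.tail(2))
```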

  26. University Groupings

  27. Mean overall Q22 for university groups

  28. Conclusions: • Subject differences (e.g. mathematical content) • Institutional differences • False assumptions (e.g. that enhancing feedback directly enhances Q22) • Institutional effects • Satisfaction is a complex measure related to L&T practices

  29. 2009 top five predictors (best first):
  Q15 The course is well organised and is running smoothly.
  Q4 The course is intellectually stimulating.
  Q1 Staff are good at explaining things.
  Q21 As a result of the course, I feel confident in tackling unfamiliar problems.
  Q10 I have received sufficient advice and support with my studies.

  30. “… [The NSS] is not a measure of satisfaction so much as a window into how our designs for learning are experienced by students. From these insights we assemble the practical measures we may take to enhance the quality of their experiences.” Quotes from Ramsden (2007)

  31. “... it is not simple to know what to do. Current experiences, unlike satisfaction, are a mixture of previous experiences and the environment as it is now... ... so sometimes we will need to adjust expectations or consider altering previous experiences in order to improve quality.” Quotes from Ramsden (2007)

  32. “I cannot agree with the idea, for example, that because students are slightly less positive about feedback on assessed work in the NSS than about the quality of teaching... ... we should rush to bully academics into providing more feedback more quickly.” Quotes from Ramsden (2007)

  33. “From this it also follows that students do not have a ‘right’ to be satisfied. They are themselves part of the experience... Students decide their own destinies and we can only add or subtract value at the margins.” Quotes from Ramsden (2007)

  34. Does anyone have an example of direct change as a result of NSS surveys? How can we use NSS ratings to enhance our practices?
