
Exploring and utilising students' perspectives on feedback: a mixed method, longitudinal approach



  1. Exploring and utilising students' perspectives on feedback: a mixed method, longitudinal approach Simon Croker, Kara Peterson, Dr Peter Hills and Dr Rachel Manning

  2. Importance of Feedback • Where students make mistakes. • Where students can improve. • What students have done well.

  3. NSS 2012 Results

  4. Background Literature • Hulme and Forshaw (2009) – form and content of feedback not helpful. • Gibbs and Simpson (2004) – lecturers and feedback. • Higgins, Hartley and Skelton (2001) – differing views on feedback meaning. • Nicol (2010) – what feedback should be. • Duncan (2007) – little improvement.

  5. Project Timeline • January 2013 – Beginning of Research • February 2013 – Pilot Phase • March – May 2013 – Data Collection Phase 1 • June – October 2013 – Feeding Forward on Perceptions of Feedback • November – December 2013 – Focus Group Phase 2 • January – March 2014 – Evaluation of Project

  6. Focus Group Participants • Total of 41 participants. • Psychology students across the three undergraduate years. • Range of academic profiles. • £9 or 1.5 credits. • Difficulty in recruiting participants.

  7. Survey Participants • Total of 98 participants. • Psychology students from the three undergraduate years and Masters. • Range of academic profiles. • £6 or 1 credit.

  8. Transcription and Analysis • Audio recordings were transcribed. • Recordings will be analysed using a full thematic analysis. • Preliminary thematic ideas are presented here. • These cover possible issues with the current feedback system and suggestions for improving it.

  9. Survey Data Figure 1. Satisfaction levels with different aspects of feedback

  10. Convenience - Issues • ‘I just read the e-mail. I haven’t collected any of mine (laughter). I just use the e-mails and go from there [INT: Yeah]. (2) I think it should be easier to actually collect your essay.’ • ‘No I’m not a fan of the codes. It’s just too many of those, [INT: um hm] I don’t remember them from my head so I always like look back what does B12 mean, oh B12, it doesn’t say here so I don’t even know what B12 means [INT: um hm].’

  11. Convenience - Issues Figure 2. Percentage of students who collect feedback and how they use it.

  12. Convenience - Suggestions • Electronic Feedback • ‘Electronic could be easier to access as well cause you’ve got like, I’m not particularly organized and we’ve got ridiculous amounts of paper work and trying to find the last times feedback to try and improve this time is just borderline impossible in my (laughter) lack of organizational skills…’ • Online sign up system for office hours. • Use of VLE could be improved.

  13. Continuing Dialogue - Issues • ‘You basically get the feedback but you can’t, I’m always struggling to understand what it means in regards to my own coursework so yeah, that’s, that’s what I think.’ • ‘I like took that extra kind of time to make sure it was perfect but I still kept getting the same thing and I was just like are they really reading over it or is it, is that just like a generic thing.’

  14. Continuing Dialogue - Issues • ‘Yeah I think it should be a lot more one on one cause like you said some people aren’t very happy with the grades they’ve got and they don’t want it to be you know they might be embarrassed in front of other people talking about it so and I think erm the B1 and all that, I think that’s it’s helpful in a way but at the same time it just tells you what you’ve done wrong not where you can improve and that is a really big aspect that you should have to kind of do better in the future because there are some people who don’t go to see the lecturers erm about their grades and stuff and then they keep making the same mistakes over and over again so [INT: um hm]’ • ‘I don’t know whether it’s my slackness or whatever, but I don’t know who I’m, not allowed to speak to, but like, but I would never realised I could go to my personal tutor because they haven’t marked, well I don’t think they’ve marked it. So I would assume it wouldn’t be their area for me to go speak to them about.’

  15. Continuing Dialogue - Issues Figure 3. Percentage of students and their attitude to feedback.

  16. Continuing Dialogue - Suggestions • Utilising the personal tutor system to create a continual dialogue with students. • Electronic storage of feedback could also allow personal tutors to quickly identify the key areas where a student needs improvement and offer support in improving those areas. • Convert feedback into an ongoing, continual process.

  17. Consistency - Issues • ‘It’s been quite mixed, like you either get three pages full of things that you could’ve done better or you get a little blank box and something and sort of a few little hand annotations on the thing…’ • ‘I think it should be standardised but obviously because of the topic it might vary what comments we’ll get written but I think if there was like a standard procedure for each topic then it might be a bit clearer, okay but overall I think it’s pretty good…’

  18. Consistency - Issues Figure 4. Student preferences for the structure of feedback.

  19. Consistency - Solutions • Bullet points highlighting the things that need improvement as well as the positive aspects of the work. • Annotated comments throughout the work rather than the use of codes. • ‘I think personally I’d want like erm bullet points at the start and then I’d want erm like details throughout, I don’t think I’d want, maybe some codes like about spelling or something like that or I don’t know a paragraphing or something but I really like the detail cause that’s what really helps me understand it and also you have to keep going back to check the different codes which takes ages.’

  20. Consistency - Solutions • Simpler coding system to highlight basic feedback ideas such as spelling or referencing. • ‘I think that would be better with codes then having like overall, like say missing things out, like more complicated things not using codes, more simple things using codes [Yeah] so referencing is quite a common one as well so codes would be better for that.’

  21. Communication - Issues • Many of the issues identified by students with the current system of feedback relate to the meta-concept of communication. • Feedback could be greatly improved through a greater and more accessible communication channel between students and lecturers as well as among lecturers. • All previous suggestions would require a good communication system to be implemented successfully.

  22. Communication - Suggestions • An electronic feedback system would increase students’ accessibility to feedback and would promote a greater utilisation of feedback. • An increase in online resources made available to students would increase contact with lecturers. • Personal tutor system to establish a continuing dialogue between students and staff, providing students with more feedback support. • Consistency could be improved through an increased communication among lecturers and the implementation of a consistent structure of feedback.

  23. Differences Between the Year Groups Figure 5. First year students perceive the feedback to be more useful than the grade, whereas the reverse is true for all other years, F(3, 88) = 4.77, MSE = 245.88, p = .004.

  24. Differences Between the Year Groups Figure 6. Masters students are more likely to speak to a lecturer when they are confused with feedback than all other year groups, F(3, 85) = 3.75, MSE = 18.56, p = .014.

  25. Correlations with Grades • Students with lower grades were: • Marginally less satisfied overall, r(91) = .19, p = .066. • Less satisfied with comments highlighting what they had done well, r(91) = .18, p = .081. • Less likely to feel that feedback was helpful for future work, r(90) = .21, p = .046. • Less likely to collect their feedback, r(88) = .25, p = .019. • Less likely to speak to a lecturer, r(87) = .22, p = .035. • Less likely to feel that feedback mattered to them, r(89) = .20, p = .058.

  26. Correlations with Grades • Students with higher grades compared to students with lower grades: • Felt that handwritten comments over typed comments showed that lecturers were more interested in their work, β = .01, p = .011. • Preferred annotations over codes, β = .01, p = .001. • Preferred to speak to lecturers about feedback rather than using codes, β = .02, p < .001 or reading paragraphs, β = .01, p = .007.

  27. Correlations with Grades • 66% of students wanted to receive all the comments on their work. • 1/3 of students did not want all of their mistakes highlighted. • Among students who wanted only a certain number of mistakes highlighted, the mean number was 5 (range 2–10). • Students with higher grades wanted all the comments on their work highlighted, whereas students with lower grades wanted only a selected number, β = .01, p = .004.

  28. Recommendations • Online submission of coursework, online marking and online returning of work. • Codes input into the online system should be written out in full on the script. • Hyperlinks from these codes to online resources that will help improve the identified issue. • Accessible to both the student and their personal tutor.

  29. Recommendations • More user-friendly VLE with ‘social media’ type activities. • Importance of the VLE highlighted and made more accessible to students. • Online booking system for office hours. • Personal tutor to mark all work in first year to make the process more transparent and less depersonalised. • Coversheets ask students to state what they want feedback to cover, their anticipated grade, the level of detail required, and how much work they put into the assignment. • Feedback consistent across lecturers in content and style.

  30. Recommendations • Positive elements of feedback. • Methods for improving. • Structured simply using bullet points. • Relate to the mark scheme to highlight how the grade was constructed. • Inform the student on how to get a better mark. • Consistent codes across all forms of work. • Codes should be simpler and more specific. • Work containing written annotations rather than codes.

  31. Recommendations • Feedback workshops with all students in first year as part of a module. • Increased use of VLE. • More concise module guides. • Further communication channels between staff and students. • Better communication between staff to ensure consistent information and feedback.

  32. Conclusion • The focus groups and interviews conducted have already begun to provide a greater understanding of students' perspectives on feedback. • The findings suggest that some areas of the current feedback system could be improved. • The implementation of the suggested changes, informed by further analysis of the data, aims to put this increased understanding of students' perspectives on feedback to use by creating an enhanced feedback experience.

  33. References
