
Using Peer Assessment & Technology to Support Learning & Reflection


Presentation Transcript


  1. Using Peer Assessment & Technology to Support Learning & Reflection Fang Lou, Steve Bennett, Trevor Barker, Schools of Life Sciences and Computer Science

  2. Introduction • Overview of the LTI project • Courses in two schools (COM and LFS) • Level 4 BSc Computer Science, E-Media Design (COM) • Level 4 BSc Bioscience (Human Physiology) and Sports (Foundations of Human Physiology) (LFS) • Winning over difficult cohorts • Developing higher-order thinking (HOT) skills • Supporting reflective learning • Improving performance

  3. The technology • Electronic Voting Systems (EVS) • CS: used for peer marking of previous cohort's work • LFS: workshop to gather students' opinions • Google Spreadsheets: • Feedback in great detail: 220 students x 3 mark sheets of 15 items each (CS); a sketch of this kind of pipeline follows below • Optical mark sheets: • Save time inputting marks (LFS) • Online data collection: • Reflection and feedback from students (LFS)
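  To make the spreadsheet side of this concrete, below is a minimal Python sketch, assuming each of the three mark sheets is exported from Google Spreadsheets as a CSV with a student_id column and fifteen item columns. The file names, column names and reference marks are hypothetical, not the project's actual tooling.

    import csv

    ITEMS = [f"item_{i}" for i in range(1, 16)]  # the 15 rubric items per sheet

    # Tutor reference marks for each of the three marked pieces
    # (values are purely illustrative).
    REFERENCE = {
        "piece_a.csv": [3, 4, 2, 5, 3, 4, 4, 2, 3, 5, 4, 3, 2, 4, 3],
        "piece_b.csv": [2, 2, 3, 4, 4, 3, 5, 3, 2, 4, 3, 4, 3, 3, 4],
        "piece_c.csv": [5, 4, 4, 3, 2, 5, 3, 4, 4, 3, 2, 5, 4, 4, 3],
    }

    def load_sheet(path):
        """Map student_id to that student's 15 item marks for one piece."""
        with open(path, newline="") as f:
            return {row["student_id"]: [int(row[c]) for c in ITEMS]
                    for row in csv.DictReader(f)}

    def marking_feedback():
        """Per student and per piece, how far each item mark sat from the
        tutors' reference mark: the raw material for detailed feedback."""
        feedback = {}
        for path, ref in REFERENCE.items():
            for sid, marks in load_sheet(path).items():
                diffs = [m - r for m, r in zip(marks, ref)]
                feedback.setdefault(sid, []).append((path, diffs))
        return feedback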

  4. Peer Assessment • A large body of research reports beneficial effects of peer assessment on student motivation and the ability to self-assess • Three comprehensive meta-studies: Topping 1998; Falchikov and Goldfinch 2000; van Zundert et al. 2010 • Significant JISC projects: REAP and PEER

  5. The E-Media Design Course • A Level 4 BSc course on the theory and practice of digital media • Students generally did not apply the design theory to their creative artefacts • Peer evaluation of the previous cohort's work was introduced • It produced an increase in student attainment but caused some student hostility

  6. The Problem (or aftertaste)
  • 2009 (before peer assessment): 25% MCQ + 25% MCQ + 50% Flash CV; results 59% (N=290) and 58% (N=277)
  • 2010 (with peer assessment): 25% EVS peer marking exercise + 25% MCQ + 50% Flash CV; results 64.5% (N=218) and 56% (N=215)
  • Learner 1: "... for example the marking criteria, it's all over the place, how can we be tested on someone's opinion? So who knows."
  • Learner 2: "Maybe we will just guess what they are thinking. I'm confused: how can a test be solely based on someone else's opinion? This means even if you can justify why you've put a certain option it doesn't matter, because your opinion doesn't mean anything."

  7. 2011 Version: The measures • Better marking proportions: 10% peer marking event, 30% MCQ, 60% Flash CV • Better selection of exemplars: the six pieces of previous-cohort work with the lowest variance between markers (see the sketch below) • Rewriting of criteria: more detailed and graduated attainment descriptors • Using a live EVS session instead of QMP • Extremely detailed feedback on student marking
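  The exemplar-selection rule above lends itself to a short sketch: rank the previous cohort's pieces by the variance of the marks that different markers gave them and keep the lowest n. The data layout and mark values below are assumed for illustration only.

    from statistics import pvariance

    # piece id -> marks awarded by different markers (illustrative values)
    marks_by_piece = {
        "p01": [62, 64, 61, 65],
        "p02": [40, 70, 55, 48],
        "p03": [71, 72, 70, 73],
        "p04": [55, 58, 54, 57],
        "p05": [68, 50, 66, 61],
        "p06": [45, 46, 44, 47],
        "p07": [60, 75, 52, 63],
    }

    def pick_exemplars(marks, n=6):
        """Return the n pieces whose marks varied least between markers."""
        return sorted(marks, key=lambda p: pvariance(marks[p]))[:n]

    print(pick_exemplars(marks_by_piece))  # lowest-variance pieces first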

  8. Result

  9. Result

  10. Result

  11. Fundamental Issue • Does using peer assessment help with the internalisation of assignment criteria and expected standards? • It seems so • But some students may merely regard it as being asked to second-guess the tutors • There was far less controversy with the revised approach: students seemed to be buying in

  12. Some Issues • This result was the culmination of three years of research. We are not convinced that simply using the approach, without a great deal of additional work, would necessarily be as effective. • Setting up the sessions, writing and revising rubrics, selling the system to students and moderators, producing support lectures and materials, etc. was not easy and took a great deal of time.

  13. Issues • Student concerns had to be dealt with • The EVS is often used in formative contexts (quiz rather than assessment); does this devalue it? • It is not absolutely reliable • It requires experience to: • Manage large EVS sessions (200+ students) • Write and reflect on rubrics and presentations • Collect and configure good samples for the sessions • Collect data • Set up the PPTs, software and hardware • Relate handset numbers to students' names (not that easy; one possible approach is sketched below)
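  One possible approach to the handset problem, sketched on the assumption that handsets are signed out against a roster before the session and that the EVS software exports responses as a CSV keyed by handset_id; the file and column names here are hypothetical.

    import csv

    def load_roster(path):
        """Map handset_id to student name from a pre-session sign-out sheet."""
        with open(path, newline="") as f:
            return {row["handset_id"]: row["student_name"]
                    for row in csv.DictReader(f)}

    def attribute_responses(session_csv, roster_csv):
        """Attach student names to raw EVS responses; keep unknown handsets
        separate so they can be chased up after the session."""
        roster = load_roster(roster_csv)
        attributed, unknown = [], []
        with open(session_csv, newline="") as f:
            for row in csv.DictReader(f):
                name = roster.get(row["handset_id"])
                (attributed if name else unknown).append({**row, "student": name})
        return attributed, unknown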

  14. Peer Assessment of a Full Lab Report: differences between two cohorts • The Level 4 BioScience (Bio) programme has used it for five years (Human Physiology module), with positive results • The Level 4 Sports Science and Sports Therapy (Sports) programme introduced it last year (Foundations of Human Physiology module), with quite negative results • The disparity between Bio and Sports • Challenge: winning over the SES/SPT cohort

  15. Sequence of events 1. Laboratory study 2. Workshop (briefing) after all students had completed the laboratory class 3. Submission of the report 4. Peer assessment (marking session) with clear marking criteria 5. Appeal/reflection/feedback: face-to-face? Email? Online?

  16. EVS question: What do you think about peer assessment? • Glad to have a go • Curious to find out how it works • Would prefer lecturers to mark my work • Not comfortable with the responsibility (responses charted for the Bio and Sports cohorts)

  17. The problem • Attitude: students did not see the point of doing it • No incentive: students did not care whether they marked well or not • Disappointment on receiving a carelessly marked report • Results: a lower level of satisfaction, a large volume of complaints, and heavy staff moderation

  18. The measures • Link to professionalism: sell the idea of peer assessment from Induction week onwards; stress it again in the workshop • Reduce the peer mark allocation from 20% to 10% • Allocate 5% for the quality of the student's marking (a worked weighting sketch follows below) • Work through an example report in the marking session • Moderate reports before releasing marks • Introduce the online feedback system (WATS): 5% to the Sports module
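  As a worked illustration of the revised weighting, the sketch below combines the two allocations stated on the slide; only the 10% and 5% weights come from the slide, while the function name and the single "remainder" component standing in for the rest of the module's assessment are assumptions.

    def module_mark(peer_assessed, marking_quality, remainder,
                    peer_w=0.10, marking_w=0.05):
        """Weighted module mark: 10% from the peer-assessed report, 5% from
        the quality of the student's own marking, the rest from the module's
        other assessment (whose make-up is not given on the slide)."""
        return (peer_w * peer_assessed
                + marking_w * marking_quality
                + (1.0 - peer_w - marking_w) * remainder)

    print(module_mark(62, 80, 58))  # 6.2 + 4.0 + 49.3 = 59.5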

  19. Findings • More positive attitudes, shown in the EVS results • High engagement: many students wrote a full page of comments • Raised satisfaction, shown in the reflection and feedback results

  20. Return rates

  21. What do the students think? (comments gathered from the Sports (FOHP) and Bio (HP) cohorts)

  22. Student comments
  • "I didn't find it useful as the report I was marking hugely lacked in effort."
  • "The discussions it raised while marking the lab reports amongst peers also aided the learning process, as you get another perspective."
  • "It has made me reflect more deeply than normal."
  • "I was surprised that it would help me with my lab report and in the future."
  • "It is beneficial to do, however I do not think it needs to be done all the time. Once or twice is enough to get a general idea of how the marking works and how to improve your work."

  23. Key points for success • Organisation of sessions • Making sure the technology works • Marking criteria • Choosing exemplars • Continuous improvement based on reflection • Selling the idea to students, including briefing • Encouraging students: reflection and feedback • Technology can help

  24. Acknowledgements • LTI Support – Enhancement Awards 2011-12 • Fang's colleagues (Mike Roberts and Chris Benham) • References • Barker, T. & Bennett, S. (2010), Marking Complex Assignments Using Peer Assessment with an Electronic Voting System and an Automated Feedback Tool, Proceedings of International Computer Assisted Assessment (CAA 2010), 20-21 July 2010, Southampton, UK • Barefoot, H., Lou, F. & Russell, M. (2011), Peer Assessment: Educationally Effective and Resource Efficient, Blended Learning in Practice, May 2011 • Bennett, S. & Barker, T. (2011a), Using Electronic Voting and Feedback: Developing HOT Skills in Learners, presented at SOLSTICE 2011, 8-9 June, Edge Hill University, UK • Bennett, S. & Barker, T. (2011b), The Use of Electronic Voting to Encourage the Development of Higher Order Thinking Skills in Learners, Proceedings of International Computer Assisted Assessment (CAA 2011), July 2011, Southampton, UK • Lou, F., Barefoot, H., Bygate, D. & Russell, M. (2010), Using Technology to Make an Existing Assessment More Efficient, poster at the International Blended Learning Conference, June 2010, University of Hertfordshire, UK
