
Review in Computerized Peer-Assessment


Presentation Transcript


  1. Review in Computerized Peer-Assessment Dr Phil Davies Department of Computing Division of Computing & Mathematical Sciences Faculty of Advanced Technology University of Glamorgan

  2. What do we need to provide to have a fully Automated Peer-Assessment System? AUTOMATICALLY CREATE A MARK THAT REFLECTS THE QUALITY OF AN ESSAY/PRODUCT VIA PEER MARKING; AUTOMATICALLY CREATE A MARK THAT REFLECTS THE QUALITY OF THE PEER MARKING PROCESS, i.e. A FAIR/REFLECTIVE MARK FOR MARKING AND COMMENTING

  3. THE FIRST CAP MARKING INTERFACE

  4. Typical Assignment Process • Students register to use the system – CAP • Create an essay in an area associated with the module using an RTF template of headings • Submit via the Blackboard Digital Drop-Box • An anonymous code is given to the essay automatically by the system • Use the CAP system to mark

  5. Self/Peer Assessment • A Self-Assessment stage is often used • Set Personal Criteria • Opportunity to identify errors • Get used to the system • Normally peer-mark about 5 or 6 essays • Raw peer MEDIAN mark produced • Need for the student to receive Comments + Marks

  6. Compensation: High and Low Markers • Need to take this into account • Each essay has a 'raw' peer-generated mark – MEDIAN • Look at each student's marking and ascertain if 'on average' they are an under- or over-marker • Offset the mark given by this value • Create a COMPENSATED PEER MARK
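The compensation step above can be sketched in a few lines of Python. This is a minimal illustration, assuming the bias is a marker's mean deviation from each essay's raw peer median (the deck does not spell out the exact formula); `marker_bias` and `compensated_mark` are hypothetical names:

```python
from statistics import median

def marker_bias(marks_given, reference_marks):
    """Average amount by which a marker over- (+) or under- (-) marks,
    measured against the raw peer median of each essay they marked."""
    diffs = [given - ref for given, ref in zip(marks_given, reference_marks)]
    return sum(diffs) / len(diffs)

def compensated_mark(raw_marks, biases):
    """Offset each peer's mark by that peer's bias, then take the median."""
    adjusted = [mark - bias for mark, bias in zip(raw_marks, biases)]
    return median(adjusted)

# An over-marker (bias +10) gave 75; compensation pulls it back to 65,
# so the compensated median settles at 60.
print(compensated_mark([75, 60, 58], [10.0, 0.0, -2.0]))  # -> 60.0
```

Subtracting each bias before re-taking the median stops one habitual over- or under-marker from dragging an essay's mark.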

  7. Below are comments given to students. Select the 3 most important to YOU • I think you've missed out a big area of the research • You've included a 'big chunk' – word for word – that you haven't cited properly • There aren't any examples given to help me understand • Grammatically it is not what it should be like • Your spelling is atrocious • You haven't explained your acronyms to me • You've directly copied my notes as your answer to the question • 50% of what you've said isn't about the question • Your answer is not aimed at the correct level of audience • All the points you make in the essay lack any references for support

  8. Each student is using a different set of weighted comments; the comments databases are sent to the tutor

  9. Comments – Both Positive and Negative in the various categories. Provides a Subjective Framework for Commenting & Marking First Stage => Self Assess own Work Second Stage (button on server) => Peer Assess 6 Essays

  10. Feedback Index • Produced an index that reflects the quality of commenting • Produced a Weighted Feedback Index • Compare how a marker has performed against these averages per essay for both Marking + Commenting – Looking for consistency
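The deck does not give the formula behind the Weighted Feedback Index, but one plausible reading is to average the weights of the menu comments a marker actually selected. Everything here – the `weighted_feedback_index` name, the weight scale, the comment keys – is an assumption for illustration:

```python
def weighted_feedback_index(selected_comments, weights):
    """Average the weights of the comments a marker used on one essay;
    a higher index suggests the marker chose the comments the student
    weighted as more important."""
    total = sum(weights[comment] for comment in selected_comments)
    return total / len(selected_comments)

# Hypothetical weights a student attached to their comment menu.
weights = {"missing_references": 3, "uncited_copying": 3, "spelling": 1}
print(weighted_feedback_index(["missing_references", "spelling"], weights))  # -> 2.0
```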

  11. The Review Element • Originally handled via Communications within the CAP marking process, which required the owner of the file to 'ask' questions of the marker • Emphasis 'should' be on the marker • The marker does NOT see the comments of other markers who've marked the same essays • The marker does not really get to reflect on their own marking – the review stage offers a reflective 2nd chance • I've avoided this in the past -> get it right first time

  12. Click on button to get an essay previously marked + comments and marks

  13. Click on button to get to view comments of another marker

  14. Can change any marks and/or comments they feel appropriate and submit by clicking button

  15. Trialled with Post-Graduate Group • 13 students • 76 markings • Average time per marking = 42 minutes (range 3-72) • Average number of menu comments/marking = 15.7 • Peer Avge. mark = 59.69% (before review 60.15%) • Number of students who did replacements = 10 (out of 13) • 41 'Replaced' markings (54%) • Out of 41 markings 'Replaced' -> 26 changed the mark: 26/76 (34%) • Only 33 out of 41 REALLY CHANGED ANYTHING • 2 students 'Replaced' ALL of their markings • Mark changes: -8, -7, -7, -7, -6, -6, -6, -5, -5, -5, -4, -4, -3, -3, -2, -2, -2, -2, -1, -1 • +1, +2, +6, +7, +8, +9

  16. Mapping of Feedback Indexes to Compensated Peer Essay Marks

  17. Self-assessment mark changes: -1, -1, -2, -9, -4, -5, -1, -4 • Initial Average Self Assessment 68.33% • Reflective Average Self Assessment 64.63%

  18. Mark changes: +3, -1, +4, -2, +6, -5, -5, +1, +5, -4, -5 • Raw Peer Generated Mark Pre-Review 60.38% • Compensated Peer Generated Mark Pre-Review 60.15%

  19. Mark changes: +5, -1, -1, +4, -2, +6, +1, -5, -5, 0, +5, -5, -2 • Raw Peer Generated Mark Post-Review 59.69% • Compensated Peer Generated Mark Post-Review 59.69%

  20. Student Mark Changes During Review Stage • +9: 30->39 • +8: 58->66 • +7: 79->86 • +6: 14->20; 40->46 • -8: 67->59 • -7: 73->67; 67->60; 60->53 • -6: 73->67; 52->46 • AVERAGE MARK CHANGE = -1.69

  21. How to work out Mark (& Comment) Consistency • Marker on average OVER marks by 10% • Essay worth 60% • Marker gave it 75% • Marker is 15% over • Would expect 10% over, therefore Actual Consistency index (Difference) = 5 • If the marker on average had UNDER marked by 10% - Difference would have been 25 • Summing and Averaging these differences produces a Marking Consistency Index (low is good – high is poor) • This can be done for all marks and comments
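The consistency calculation above can be sketched as follows (the `consistency_index` name and list-based interface are mine, not the CAP system's):

```python
def consistency_index(bias, true_marks, given_marks):
    """Average absolute gap between a marker's actual deviation on each
    essay and the deviation their overall bias would predict.
    Low is good (consistent); high is poor (erratic)."""
    diffs = []
    for true_mark, given in zip(true_marks, given_marks):
        deviation = given - true_mark        # e.g. 75 - 60 = +15
        diffs.append(abs(deviation - bias))  # expected +10 -> difference 5
    return sum(diffs) / len(diffs)

# The slide's worked example: a +10% over-marker gives 75 to a 60% essay.
print(consistency_index(10, [60], [75]))   # -> 5.0
# Had they been a -10% under-marker, the difference would have been 25.
print(consistency_index(-10, [60], [75]))  # -> 25.0
```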

  22. Per-marker average mark differences: -1.26, -0.94, -0.24, -1.37, +0.13, -0.41, -4.04, -0.15, -0.85, -1.74, +0.78 • Would hope for the AVERAGE MARK DIFFERENCE to DECREASE following review: -1.00 -> -0.72 • Would hope for MARK CONSISTENCY to DECREASE following review: 6.30 -> 5.38

  23. Automatically Generate Mark for Marking • Linear scale 0-100 mapped directly to consistency … the way in HE? • Expectation of Normalised Results within a particular cohort / subject / institution? • Map to the Essay Grade Scale achieved (better reflecting the ability of the group)?

  24. Current 'Simple' method • Average mark for essay e.g. 55% • Range of highest-lowest marks achieved for essay e.g. 45% <-> 70% • Average Marking Consistency e.g. 5.0 • Range of highest-lowest consistency indexes achieved e.g. 2.5 <-> 8.0 • Essay scale: 45 <- (10) -> 55% <- (15) -> 70 • Consistency scale: 8.0 <- (3.0) -> 5.0 <- (2.5) -> 2.5 • Essay marks per consistency unit: 10/3 = 3.33 below average, 15/2.5 = 6.0 above average • e.g. Mark Consistency = 6.0 -> 1 unit worse than average -> Mark for marking = 55% - (1 * 3.33) = 51.66%
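The piecewise mapping above can be sketched as follows; `mark_for_marking` is a hypothetical name, and the cohort statistics are the slide's example values baked in as defaults:

```python
def mark_for_marking(cons, avg_cons=5.0, worst_cons=8.0, best_cons=2.5,
                     avg_mark=55.0, low_mark=45.0, high_mark=70.0):
    """Piecewise-linear map from a marking-consistency index (low = good)
    onto the cohort's essay-mark range."""
    if cons >= avg_cons:
        # Worse than average: each consistency unit costs 10/3 marks.
        per_unit = (avg_mark - low_mark) / (worst_cons - avg_cons)
        return avg_mark - (cons - avg_cons) * per_unit
    # Better than average: each consistency unit earns 15/2.5 marks.
    per_unit = (high_mark - avg_mark) / (avg_cons - best_cons)
    return avg_mark + (avg_cons - cons) * per_unit

# The slide's example: consistency 6.0 -> 55 - 3.33 (slide truncates to 51.66%).
print(round(mark_for_marking(6.0), 2))  # -> 51.67
```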

  25. What about the commenting? • Does not take into account the Quality of the Commenting • Should look at the Average Feedback Differences per marker to get a Commenting Consistency Grade • Same as creating Mark Consistency • Create a Commenting Consistency Index

  26. Correlation between Marking & Commenting Consistency 0.49

  27. Correlation between Marking Consistency and Essay Mark 0.17

  28. Final Grade for Coursework takes into account Essay Grade, Mark for Marking and Mark for Commenting percentages • Correlation between Commenting Consistency and Essay Mark 0.05

  29. Split in Marks 60 / 20 / 20 • Is it reasonable? • Higher Order skills of Marking – worth more? • If we’re judging marking process on consistency – then should be rewarded for showing consistency within marking AND commenting • Revised split 60 / 15 / 15 / 10
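Under the revised split above, the final grade is just a weighted sum. This sketch assumes the extra 10% rewards consistency across marking and commenting (the slide suggests but does not state this), and `final_grade` is a hypothetical name:

```python
def final_grade(essay, marking, commenting, consistency,
                weights=(0.60, 0.15, 0.15, 0.10)):
    """Combine the component marks under the revised 60/15/15/10 split."""
    parts = (essay, marking, commenting, consistency)
    return sum(w * p for w, p in zip(weights, parts))

# e.g. essay 60%, mark-for-marking 52%, mark-for-commenting 55%, consistency 58%
print(round(final_grade(60, 52, 55, 58), 2))  # -> 57.85
```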

  30. Correlation between Final ESSAY grade & Mark for Marking, Commenting & Consistency is 0.54 Correlation between FINAL MARK & ESSAY GRADE is 0.85

  31. IS IT WORTH THE HASSLE??

  32. Student Comments • Have you used Peer-Assessment in Past? • How did you find self-assessment? • Creating the Comments Database? • How did they find using the CAP system and peer-assessment? • Thoughts on new Review Stage? • Thoughts on Mark for Marking?

  33. Two Main Points to Consider • How do we assess the time required to perform the marking task? • What split of the marks between creation & marking • Definition • Student or Lecturer Comments

  34. Contact Information Email: pdavies@glam.ac.uk Phone: 01443 - 482247 Dr Phil Davies J317 Department of Computing & Mathematical Sciences Faculty of Advanced Technology University of Glamorgan

  35. ANY QUESTIONS OR COMMENTS

  36. THE END

  37. General View of Peer Assessment – Lecturer or Student? • Lecturers getting out of doing their jobs, i.e. marking • Good for developing student skills & employability • How can all students be expected to mark as 'well' as 'experts'? • Why should I mark 'properly' and waste my time – I get a fixed mark for doing it • The feedback given by students is not of the same standard that I give • The best thing I've ever done to make me reflect • VERY PERSONAL – salary – WANT BENEFITS NOW!!

  38. Defining Peer-Assessment • In describing the teacher .. A tall b******, so he was. A tall thin, mean b******, with a baldy head like a light bulb. He’d make us mark each other’s work, then for every wrong mark we got, we’d get a thump. That way – he paused – ‘we were implicated in each other’s pain’ McCarthy’s Bar (Pete McCarthy, 2000, page 68)

  39. Student Comments • Used Peer-Assessment in past? • None to any real degree • A couple for staff development type activities

  40. Student Comments How did you feel about performing Self-assessment? • Very Difficult. • Helped to promote critical thinking ready for peer-assessment stage. • Made me think about how I was going to assess others

  41. Student Comments • Creating Comments Database? • Very difficult – not knowing what comments they’d need • Weighting really helped me create criteria ready for marking • Could have helped to do dummy marking

  42. Student Comments • How did they find using the CAP system and peer-assessment? • Very positive & interesting • Very time consuming • Would do it better next time • Important to maintain anonymity • Interesting & complex – thought more about the assessment process • Really helped student development

  43. Student Comments • Thoughts on new Review Stage • Liked 2nd chance to review own marks • Gained experience going through process • Didn’t really take much note of peers’ comments • Liked to see that others felt the same about an essay

  44. Student Comments • Thoughts on Mark for Marking • Good marking rewarded appropriately • Difficult to fully understand • Let the owner of the essay provide the mark for marking
