
Self and Peer Assessment Through DUO



  1. Self and Peer Assessment Through DUO Hannah Whaley University of Dundee

  2. Background • University of Dundee • Growing use of many forms of self and peer assessment • Paper-based systems and ad hoc online approaches • Re-developed the system to integrate with Blackboard (Bb) • New system • User-centred design • Involved academic staff from a range of subjects • Partnered with Blackboard • Fully integrated with Blackboard • Released in v8.0, so available now

  3. Background

  4. Background

  5. 1. Understanding Self and Peer Assessment 2. Deciding Where and When 3. Designing Assessments 4. Running Assessments

  6. 1. Understanding Self and Peer Assessment 2. Deciding Where and When 3. Designing Assessments 4. Running Assessments

  7. 1. Understanding Self and Peer Assessment • Important to fully understand the concept • Confusion over terminology: criterion-based reference marking, peer marking, peer review, self assessment, peer reflection, marking rubrics, critical analysis, groupwork assessment • Focus on the real use of the pedagogy • Only then can you realise the full potential for learning

  8. 1. Understanding Self and Peer Assessment • Individual work, not group • Fixed marking criteria • Reflection, analysis, evaluation

  9. 1. Understanding Self and Peer Assessment • Process • Academics design assessment • Includes questions and marking criteria • Creates the assessment in Blackboard • Student completes the assessment • Could be one or more questions • Submits answers in Blackboard • Student marks the assessment • Returns to assessment in Blackboard • Is given a list of students to mark • Academic moderates results • Monitors submissions and marking phases • Moderates results before releasing them to students

  10. 1. Understanding Self and Peer Assessment • Challenging process for both staff and students • Students: reflection, critical, constructive, engage • Academics: be creative, be precise, let go!, moderate, sys admin • Both: understand, support and get excited

  11. 1. Understanding Self and Peer Assessment 2. Deciding Where and When 3. Designing Assessments 4. Running Assessments

  12. 2. Deciding Where and When • Formative or summative? • Formative works particularly well • Summative should include the offer of moderation • Replace an old assessment or add a new one? • Updating old assessments works well • Chance to add innovative new practice • What’s the purpose of the assessment? • Add interaction, reduce marking load, extra practice, new skills • Focus on purpose in assessment design • How long should it run for? • 2 weeks is standard – use the defaults that are given • 1 week for the assessment, 1 week for evaluating • Supervise it in IT suites or not? • Generally, it can be completed entirely online • Makes good use of a practical session

  13.–17. 2. Deciding Where and When [Series of diagrams placing assessment approaches on an interactivity scale, HIGH to LOW]

  18. 2. Deciding Where and When • Self and Peer: Creates Question, Prepares Answer, Creates Criteria, Marking Answers, Writing Feedback, Moderation, Reviews Feedback, Formal marks, Exercise review • Traditional: Creates Question, Prepares Answer, Creates Criteria, Marking Answers, Writing Feedback, Reviews Feedback, Moderation, Formal marks, Exercise review

  19. 1. Understanding Self and Peer Assessment 2. Deciding Where and When 3. Designing Assessments 4. Running Assessments

  20. 3. Designing Assessments • Focus of assessment • Learning objectives (primary and secondary) • Discipline-specific context • Flexibility within tool for design

  21. 3. Designing Assessments • Essay-style and exam-style assessments are catered for • Submission options include text, HTML and links • Anonymous or not; change the number of peers to mark • [Diagram: an Assessment contains Questions, each with one or more Criteria]

  22. 3. Designing Assessments Example 1 Subject: Life Sciences Motive: Reduce Marking Time Extra Learning: Practice at exam questions Old or New: Created from old tutorials Very specific criteria Exam style 30 questions, 1 criterion each Model answers

  23. 3. Designing Assessments Example 1 Subject: Life Sciences Motive: Reduce Marking Time Extra Learning: Practice at exam questions Old or New: Created from old tutorials Subject specific Text answers File uploads Majority no feedback Marking 2 peers and self

  24. 3. Designing Assessments Example 1 Subject: Life Sciences Motive: Reduce Marking Time Extra Learning: Practice at exam questions Old or New: Created from old tutorials Only 1 exercise used per year previously; 4 hours moderating 4 exercises Students gain lots of practice Common mistakes, marking scales, model answers

  25. 3. Designing Assessments Example 2 Subject: Geography Motive: Innovative practical lab Extra Learning: Understand their answers Old or New: New idea Bit of Both 2 Questions Subjective and Specific criteria Granular and expansive marks

  26. 3. Designing Assessments Example 2 Subject: Geography Motive: Innovative practical lab Extra Learning: Understand their answers Old or New: New idea Deep Learning Text answers Subjective and Specific criteria Marking 3 peers and self

  27. 3. Designing Assessments Example 2 Subject: Geography Motive: Innovative practical lab Extra Learning: Understand their answers Old or New: New idea Innovative way to introduce students to academic reading Promoting deep learning – synthesis and evaluation Students forced to give opinions and justify them Makes use of flexibility of the system, combining two approaches

  28. 3. Designing Assessments

  29. 3. Designing Assessments Example 3 Subject: Law Motive: Improve assessment Extra Learning: Give better feedback Old or New: Added online component Blended style File upload Open Criteria, with guidance Small workload

  30. 3. Designing Assessments Example 3 Subject: Law Motive: Improve assessment Extra Learning: Give better feedback Old or New: Added online component Quick answers 3 markers Self reflection Emphasis on constructive feedback

  31. 3. Designing Assessments Example 3 Subject: Law Motive: Improve assessment Extra Learning: Give better feedback Old or New: Added online component Blending online component with existing teaching practices Enhance face to face section, formalise feedback Students get better feedback, from a wider range Understand better and worse presentations clearly

  32. 1. Understanding Self and Peer Assessment 2. Deciding Where and When 3. Designing Assessments 4. Running Assessments

  33. 4. Running Assessments • Flexibility built into system • Timing of assessments • Workload • Publishing results • Motivation • Moderation

  34. 4. Running Assessments • Motivation • Student understanding of process • Importance for their learning • Assignment completed after both parts • Marks can be withheld • Actively encourage non completers • Email and remind them • Sometimes… • Marks for marking • Deviation from average or tutor mark
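The "marks for marking" idea above – rewarding students whose marks stay close to the class average or a tutor mark – can be sketched as a simple calculation. This is an illustrative example only, not Blackboard's actual algorithm; the function name and penalty weight are hypothetical.

```python
def marking_accuracy(peer_marks_given, reference_marks, penalty_per_point=1.0):
    """Score 0-100 for how accurately a student marked their peers.

    Each mark the student gave is compared with a reference mark for the
    same answer (e.g. the class average or a tutor mark); the score drops
    by `penalty_per_point` for every point of deviation, floored at 0.
    """
    scores = []
    for given, reference in zip(peer_marks_given, reference_marks):
        deviation = abs(given - reference)
        scores.append(max(0.0, 100.0 - penalty_per_point * deviation))
    return sum(scores) / len(scores)

# A student marked three peers' answers; references are class averages.
print(marking_accuracy([65, 80, 40], [70, 72, 45]))  # → 94.0
```

A moderator would still review and could override the result, in line with the moderation phases described below.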

  35. 4. Running Assessments • Moderation • Moderator can override any average grade • 3 key phases, each with moderator overview • Submission • Marking • Results

  36. 4. Running Assessments • Moderating Submissions • Encourage • Check submissions for problems • Download submissions

  37. 4. Running Assessments • Moderating Evaluations • Encourage • Check for problems • Download evaluation results

  38. 4. Running Assessments • Moderating Results • Check for problems • Finalise and publish to Grade Center • Any grade can be over-ridden in the Grade Center • Add feedback or grading notes

  39. 4. Running Assessments • Moderation styles • On request (recommended) • Highs and Lows • Unexpected • Random Sample (recommended) • Understand the process • It’s student marking • Can’t get a ‘correct’ grade • Accept the average and the learning

  40. Common Mistakes • Not understanding it • Poor criteria • Overloading students • Obsessive moderating

  41. Common Mistakes • Not using the preview • Changing questions and criteria • Changing dates back and forth • Changing enrolments

  42. Benefits • Re-usable resource – shareable good practice • Moved from paper based and ad hoc systems • Promoting really deep learning • Comprehension, application, synthesis, evaluation • Students learn ‘soft skills’ • Giving effective feedback, analysing, criticising • Students gain learning skills • Assessment criteria, marking, answering questions • Students can place their work • See work better and worse than their own, monitor their own learning

  43. Some Ideas… • First drafts • Review resources • Portfolio submission • Video • Past Papers • Research…

  44. Conclusions • Experiences gained using the system for 2 years • Flexible, robust and expandable pedagogy • Challenge in creating challenging assessments • Benefit from experience of moderating • Not always easy • May not get right first time • Inspired, motivated, ideas forming?

  45. Contact • Hannah Whaley, University of Dundee, Scotland • h.whaley@dundee.ac.uk
