
Enhancing Assessment and Feedback: Principles and Practice. David Nicol



Presentation Transcript


  1. Enhancing Assessment and Feedback: Principles and Practice. David Nicol, Professor of Higher Education, Centre for Academic Practice and Learning Enhancement (CAPLE), Director of REAP and PEER projects (www.reap.ac.uk), University of Strathclyde, Scotland. ATN Australia: 20th October 2011

  2. National Student Survey (UK) Assessment & feedback (2008)

  3. Plan • Background • Re-engineering Assessment Practices (REAP) project [£1m] • Concepts and example of practice • An institutional viewpoint • PEER project

  4. Background • Departments and faculties: supporting educational improvement projects • REAP project: big implementation across 3 universities • Policy/strategy: led development of policy in assessment and feedback [based on REAP] • Students: ‘Feedback as dialogue’ campaign • PEER project: JISC funded [£50k] • HE sector: project facilitator for QAA Scotland on A&F • Research/publications: assessment, learning, change • See www.reap.ac.uk

  5. REAP : Re-engineering Assessment Practices • Scottish Funding Council for Universities (£1m project) • 3 Universities - Strathclyde, Glasgow & Glasgow Caledonian • Large 1st year classes (160-600 students) • A range of disciplines (19 modules ~6000 students) • Many technologies: online tests, simulations, discussion boards, e-portfolios, e-voting, peer/feedback software, VLE, online-offline • Learning quality and teaching efficiencies • Assessment for learner self-regulation • www.reap.ac.uk

  6. Background (1) • Gibbs, G. & Simpson, C. (2004). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1, 3-31. • See also the Formative Assessment in Science Teaching (FAST) project at: http://www.open.ac.uk/science/fdtl

  7. Gibbs and Simpson (2004) Assessment tasks [Conditions 1-4] • Capture sufficient study time (in and out of class) • Are spread out evenly across the timeline of study • Lead to productive activity (deep vs surface learning) • Communicate clear and high expectations. I.e. the concern here is with ‘time on task’: how much work students do and their active engagement in study

  8. Background (2) Literature review • Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218. Background • Student Enhanced Learning through Effective Feedback [SENLEF] project, funded by the HE Academy • REAP project: www.reap.ac.uk

  9. Rethinking assessment and feedback • 1. Consider self and peers, as much as the teacher, as sources of assessment and feedback: taps into different qualities than the teacher can provide; saves time; provides considerable learning benefits (lifelong learning) • 2. Focus on every step of the cycle: understanding the task criteria; applying what was learned in action • 3. Not just written feedback: also verbal, computer-based, vicarious, formal and informal

  10. Seven principles of good feedback Good feedback: • Clarifies what good performance is (goals, criteria, standards) • Facilitates the development of reflection and self-assessment in learning • Delivers high-quality information to students that enables them to self-correct • Encourages student-teacher and peer dialogue around learning • Encourages positive motivational beliefs and self-esteem • Provides opportunities to act on feedback • Provides information to teachers that can be used to help shape their teaching (making learning visible). Source: Nicol and Macfarlane-Dick (2006)

  11. Principle 1: Clarify what good performance is (the context of dialogue). A continuum of techniques running from ENGAGEMENT to EMPOWERMENT/SELF-REGULATION: • Provide a document with criteria • Students rephrase criteria in own words • Exemplars of different performance levels provided • Students identify criteria from samples of work • Students add own criteria • Students create criteria

  12. Two meta-principles • Meta-principle 1: time and effort on task (structured engagement), i.e. steers on how much work to do and when: Gibbs and Simpson’s four conditions • Meta-principle 2: developing learner self-regulation (empowerment), i.e. steers to encourage ownership of learning: the seven principles discussed above. The key task for the teacher is to balance 1 and 2

  13. Example: Psychology

  14. Psychology • 560 first-year students • 6 topic areas (e.g. personality, classical conditioning), 48 lectures, 4 tutorials, 12 practicals • Assessment: 2 x MCQ tests (25%), tutorial attendance (4%), taking part in an experiment (5%), essay exam (66%)

  15. Problems identified • No practice in writing skills, yet these were required in the exam • The exam required more detail than the lectures provided (students were not doing enough independent reading) • No feedback except on multiple-choice questions (percentage correct) • Constraint: staff workload could not be increased

  16. Psychology redesign • Discussion board in the Learning Management System • Students in 85 discussion groups of 7-8, in the same groups throughout the year • Also an open discussion board for the whole class • Friday lectures cancelled: students discover the material themselves • Series of online tasks

  17. Structure of group tasks • 6 cycles of 3 weeks (one cycle per major course topic) • Week one: ‘light’ written task (e.g. define terms), 7 short answers (all students answer) • Week two: guided reading • Week three: ‘deep’ written task; students collaborate in writing a 700-800 word essay on the same topic • Within each week: the Monday lecture introduces material; immediately after the lecture, the task is posted online, for delivery the following Monday; model answers (selected from student work) are posted for the previous week’s task. The cycle is sketched below.
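For illustration only: a minimal Python sketch of the six-cycle, three-week calendar just described. The topic names and the schedule() helper are placeholders; the source does not give the actual topic order.

# Illustrative sketch, assuming one 3-week cycle per course topic.
# Placeholder topic names, not the actual course topics.
CYCLE = [
    "'light' written task (7 short answers)",            # week 1 of cycle
    "guided reading",                                    # week 2 of cycle
    "'deep' written task (700-800 word group essay)",    # week 3 of cycle
]

def schedule(topics):
    """Yield (week_number, topic, activity) across all cycles."""
    week = 1
    for topic in topics:
        for activity in CYCLE:
            yield week, topic, activity
            week += 1

for week, topic, activity in schedule([f"Topic {i}" for i in range(1, 7)]):
    print(f"Week {week:2d} | {topic} | {activity}")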

  18. The teaching role • Participation in the discussions was compulsory but not marked (in subsequent years there was a 2% mark for participation) • The course leader provided general feedback to the whole class, often motivational • He encouraged students to give each other feedback • The group discussions were not moderated, but were monitored for participation

  19. An example of a ‘deep’ task • The task, an 800-word essay: Assess the strengths and weaknesses of Freud’s and Eysenck’s theories of personality. Are the theories incompatible? • Readings suggested • Questions provided which all students should try

  20. Relation to Gibbs and Simpson’s four assessment conditions • Tasks require significant study out of class (condition 1) • Tasks are distributed across topics and weeks (condition 2) • They move students progressively to deeper levels of understanding (condition 3) • There are explicit goals and a progressive increase in challenge (condition 4)

  21. Relation to the 7 feedback principles • Standard format and model answers provide progressive clarification of expectations (principle 1) • Students encouraged to self-assess against the model answer (principle 2) • Course leader provides motivational and meta-level feedback and selects model answers (principle 3) • Online peer discussion aimed at reaching consensus about the response is a core feature of the design (principle 4) • Focus on learning, not just marks; sense of control/challenge enhanced motivation (principle 5) • Repeated cycle of topics and tasks provides opportunities to act on feedback (principle 6) • VLE captures all interactions, allowing the course leader to monitor progress and adapt teaching (principle 7)

  22. Benefits • Students worked exceptionally hard • Written responses of exceedingly high standard • Students took responsibility for learning • High levels of motivation: atmosphere in class improved • Online interactions showed powerful ‘scaffolding’ and community building • Feedback achieved with 560 students through peer and self-feedback (model answers) • Easy for tutors to monitor participation • Improved mean exam performance (up from 51% to 59%, p<0.01); weaker students benefited most

  23. Online postings/interaction • 24,362 messages posted by groups for essay tasks • Average number of postings per student: 44.3 • 1,067 postings to the general open discussion forum • Students set up online study groups for other subjects • Structured online tasks triggered important socio-cognitive processes

  24. Has it worked?

  25. Questions and Discussion

  26. Why use principles? • Provides a framework for operationalizing the big idea: the development of learner self-regulation • Helps translate the research into accessible guidelines for teaching practice • Ensures change is educationally driven • Enables connections to be made across innovations in different disciplines • Provides a common language to talk about innovation and for dissemination • Can add to the evaluation (process measures) • Helps identify where technology can leverage benefits

  27. Guidelines for implementation • A single principle or many? • Tight-loose: maintain fidelity to the principles (tight) but encourage disciplines to develop their own techniques of implementation (loose) • Balance teacher feedback with peer and self-generated feedback • The more actively engaged students are, the better the course design

  28. Developments since REAP • Principles of assessment and feedback approved by the University Senate and embedded in policy (2008) • Use of the principles to inform curriculum renewal and quality assurance processes • ‘Feedback as Dialogue’ campaign to gain the commitment of students • PEER project (Peer Evaluation in Education Review)

  29. Peer Evaluation in Education Review [PEER] The aims of the PEER project are to: • Review the evidence base for peer review • Develop educational designs for peer review (and self-review) • Identify software support for peer review • Pilot implementations of peer review with large student numbers • Produce guidelines for higher education: why do it, how to do it, pitfalls and solutions, and software possibilities. See http://www.reap.ac.uk/peer.aspx

  30. Peer feedback: augmenting teacher feedback • Increases the quantity and variety of feedback • No extra workload on the teacher when supported by software (e.g. PeerMark, Aropa) • More timely, e.g. in collaborative projects • Simulates professional life: reconciling different feedback perspectives

  31. The argument Not enough attention has been focused on the potential of peer feedback not just as a way of increasing the quantity and quality of the feedback students receive, but as a way of giving students practice in constructing feedback.

  32. The focus • Scenarios where students make evaluative judgements about the work of peers and provide a feedback commentary, usually written • Not talking about scenarios involving: informal feedback in collaborative tasks; students evaluating each other’s contribution to group working; students grading/marking each other’s work (although some rating might be part of a peer design)

  33. Benefits of feedback construction (1) Constructivist rather than ‘telling’ paradigm High-level cognitive activity: students cannot easily be passive Students actively exercise assessment criteria from many perspectives Writing commentaries develops deep disciplinary expertise See many approaches and learn that quality can be produced in different ways Shifts responsibility to student – puts them in role of assessor exercising critical judgement

  34. Benefits of feedback construction (2) • Learn to assess own work, as exactly the same skills are involved • Develops the capacity to make evaluative judgements: a fundamental requirement for life beyond university. This capacity also underpins all graduate attribute development (Nicol, 2010) • Nicol, D. (2011). Developing students’ ability to construct feedback. QAA for Higher Education, UK

  35. Example 1: Peer feedback on laboratory work [Gibbs 2011, www.testa.ac.uk/resources] • Reason: poor quality of lab reports in science • Strategy: students, organised in groups, produce a poster to represent their lab report • Peer process: posters hung in class; all students individually walk round, analyse the posters, and write feedback on them (e.g. on Post-its): questions, suggestions, inaccuracies, etc. • Result: significant improvements in lab work and reporting; positive competition in class (students did not want to look bad)

  36. Example 2: Engineering Design, PEER project case study • DM 100 Design 1: first-year class • Dr Avril Thomson, Course Leader, Design Manufacturing and Engineering Management (DMEM), University of Strathclyde, avril.thomson@strath.ac.uk • Caroline Breslin, Learning Technology Adviser, University of Strathclyde, caroline.breslin@strath.ac.uk

  37. Example 2: Design 1 • 82 first-year students • Design a product on the theme ‘eating and resting in the city’ • Research in groups (in the city, in the library, etc.) • Individually produce a Product Design Specification (PDS): detailed requirements for, and constraints on, the design (rationale, performance, standards, manufacturing, etc.) • Given a PDS exemplar from another domain (a stainless steel hot water cylinder) to show what’s required • Online learning environment: Moodle, plus PeerMark (part of the Turnitin suite)

  38. Product Design Specification: Typical Headings

  39. DM 100: Design 1 • Peer review task • Individually, each student peer-reviewed and provided feedback on the draft PDS of two other students (an illustrative allocation sketch follows below) • Criteria: completeness, convincingness of rationale, specificity of values (performance), and one main suggestion for improvement with reasons • Students used the experience of giving and receiving feedback to update the PDS, which forms part of a Folio • Students self-review the Folio and meet/discuss with a tutor • Peer review not assessed directly, but 10% of marks for professionalism, which included participation in peer review
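A double-review allocation like the one above (every student reviews two peers, no self-review, an even load) can be generated with a simple rotation. Below is a minimal Python sketch assuming a plain round-robin; it is not PeerMark's actual assignment mechanism, which the source does not describe.

import random

def assign_reviewers(students, k=2, seed=0):
    # Shuffle once, then each student reviews the next k students in the
    # circle: no self-review, and everyone gives and receives exactly k
    # reviews. Illustrative only, not PeerMark's real algorithm.
    order = students[:]
    random.Random(seed).shuffle(order)
    n = len(order)
    return {order[i]: [order[(i + j) % n] for j in range(1, k + 1)]
            for i in range(n)}

# Example: 82 first-year students, each reviewing two draft PDS documents.
allocation = assign_reviewers([f"student_{i}" for i in range(1, 83)])
print(allocation["student_1"])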

  40. Peer review rubric: DM 100 • Do you feel the PDS is complete in the range of headings covered? If not, can you suggest any headings that would contribute towards the completeness of the PDS, and explain why they are important? • Is the PDS specific enough? Does it specify appropriate target values or ranges of values? Please suggest aspects that would benefit from further detail, and explain. • To what extent do you think the rationale for the PDS is convincing? Can you make any suggestions as to how it might be made more convincing? Please explain. • Can you identify one main improvement that could be made to the PDS? Provide reason(s) for your answer.

  41. Evaluation • Online survey completed by 64 students • Coursework marks compared with previous years • Focus group interviews • Peer review comments recorded online

  42. Results 1 • Which aspects of the peer review did you learn from? • Giving feedback: 10.9% • Receiving feedback: 26.6% • Giving and receiving feedback: 54.7% • Neither giving nor receiving: 7.8%

  43. Results 2 Did you modify your initial submission as a result of the peer review activity?

  44. Results: student comments • If yes, please give specific examples of modifications (n=41) • ‘I added a couple of paragraphs and improved existing paragraphs; this added two full A4 pages to my work’ • ‘I included specific materials and changed the formatting of the document so it looked more professional’ • ‘I provided more specific numeric values and expanded my rationale after seeing someone else’s PDS and after receiving feedback’

  45. Results: RECEIVING feedback • Please give examples of what you learned from RECEIVING peer reviews from other students (n=54) • Parts that I had previously missed were brought to eye such as market competition (noticing) • Receiving peer reviews gave me insight into what others thought of my work and gave me a direction to improve (reader response) • Where the PDS was confusing to understand (reader response) • I found out how good mine was (motivational) • The person who peer reviewed my PDS gave me positive feedback which helped me a lot (motivational) • Not much, they weren’t very good
