
Writing in Engineering – peer assessment?





Presentation Transcript


  1. Writing in Engineering – peer assessment? Peer and collaborative assessment of written coursework in engineering modules Julia Shelton, Jens Mueller School of Engineering and Material Science Teresa McConlogue, Language and Learning Unit

  2. Load on Engineering students
  • Taught contact hours: 12–16 hours/week
  • Problem Based Learning: 25 hours/problem
  • Experiments + write-ups: 40 hours/semester
  • Homework: 3 hours/week
  • Problem solving through numerical examples: 20 hours/week

  3. What were the issues?
  • Content-heavy courses, especially in the second year
  • No room in the curriculum for input on writing
  • Large classes – difficult to mark writing and give detailed, timely feedback

  4. We wanted students to
  • be aware of tutor expectations in coursework
  • develop judgement on the quality of their work
  • become autonomous learners

  5. What were the strategies?
  • Embedded writing tasks with clear task instructions and assessment criteria
    – to improve students’ understanding of tutor expectations
  • Peer assessment of these tasks
    – to develop students’ understanding of ‘quality’ in writing in engineering

  6. Planning considerations
  • the importance of practice marking and preparation for students
  • the possibility and value of co-constructing assessment criteria
  • the use of multiple markers
  • anonymity issues
  • the need to give guidance on feedback practices
  • consideration of procedures for resolving complaints
  • evaluation strategy

  7. Engineering Case Studies
  • Level 5, Year 2: Medical materials module – 70 students
  • Level 4, Year 1: Fluid mechanics module – 280 students
  • Level 7, Year 4: Computational fluid dynamics module – 20 students
  • Level 7, Year 4: Implant design and technology module – 20 students

  8. Implementation of peer assessment: level 5, 2nd year medical materials module
  • Students were required to submit a 5-page technical lab report in a standard structure (provided)
  • After report submission, a preparation session was run
  • Rehearsal marking: students marked 4 reports, ranked them, and discussed grades and comments in groups
  • In a plenary session, grades were compared and reasons for discrepancies discussed
  • Reports were anonymised and allocated to peer assessors
  • Marks and comments were submitted rapidly
  • Mean grades were determined by the module organiser and comments returned to students within a further week
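The anonymise → allocate → average workflow on this slide could be sketched as follows. This is a minimal illustration only: the function names (`allocate_reports`, `mean_grade`) and the simple random allocation are assumptions for the sketch, not the module's actual marking system, and a real system would also balance the marking load across assessors.

```python
import random
import statistics

def allocate_reports(authors, assessors_per_report=2, seed=0):
    """Assign each anonymised report to several peer assessors.

    Each author's report is marked by `assessors_per_report` other
    students; no student is ever allocated their own report.
    (Illustrative sketch: real allocation would also balance the
    number of reports each assessor marks.)
    """
    rng = random.Random(seed)  # fixed seed so allocation is reproducible
    allocation = {}
    for author in authors:
        peers = [s for s in authors if s != author]  # exclude the author
        allocation[author] = rng.sample(peers, assessors_per_report)
    return allocation

def mean_grade(marks):
    """Average the peer marks for one report into the returned grade."""
    return statistics.mean(marks)
```

For example, `allocate_reports(["S01", "S02", "S03", "S04"])` maps each anonymised report to two other students, after which the organiser can combine the submitted marks with `mean_grade`.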

  9. Issues for consideration
  • Keeping anonymity
  • Individual marks given to students, or mean mark
  • Assessing the value of markers’ feedback
  • Checking on the accuracy / validity of feedback
  • Rewarding student feedback
  • Uncertainty about the value of marks from fellow students

  10. Value of peer assessment exercise
  • Students formally saw other students’ reports
  • Formulating feedback provided some learning value
  • Students queried the validity of feedback

  11. Developments from 2009
  • Implemented in a large 1st year, level 4 module
  • Web interface for peer assessment, including submission, allocation, marks, comments and feedback

  12. Implementation of peer assessment: level 7, 4th year implant design and technology module
  • Students were set the task of writing a report to a small-company CEO describing the background of a particular joint replacement, to inform a decision on its possible development within the company
  • Before students prepared and submitted reports, suitable marking criteria were discussed and evolved: co-construction of assessment criteria
  • Students submitted reports; rehearsal marking on 3 reports
  • In groups, students discussed grades and comments; grades were compared and reasons for discrepancies discussed
  • Each report was allocated to 4 peer assessors
  • Marks and comments were submitted rapidly
  • Mean grades were determined and comments returned to students within a further week

  13. Features
  • Introduced co-construction of assessment criteria
  • Students read and marked material they had not researched themselves
  • Used the web interface for peer assessment, submission and comments

  14. Student evaluation 2012
  • Written open-ended evaluation at end of module
  • Judging the reports
  • Your writing
  • Overall comments

  15. Value in undertaking the peer assessment exercise
  • Learnt from unfamiliar topics: ‘learnt on other topics without having to research...good as an overview’
  • Saw examples of different styles and reports
  • Saw clearly what not to do: ‘we learnt alot from others, their writing style, things to improve and things to avoid when writing’
  • Understood the value of report structure: ‘I will link my paragraphs more, add subheadings to aid understanding, add images..’

  16. Issues students reported
  • Difficult to assess unfamiliar topics: ‘learning should be given about all the topics before conducting the review’
  • Poor quality feedback: ‘should make sure everyone puts the same effort in and those who don’t should be penalised’
  • Time consuming
  • Difficult to grade – wanted more guidance on values to award: ‘more structured in terms of specifying what makes a report very good (A) or how do you grade a report and give it B, C, D’
  • Lacked confidence in the final mark

  17. How students would change their writing from this exercise
  • Change the emphasis on elements
  • Answer the topic more precisely
  • Change their use of language
  • Add more tables and figures and describe them more fully
  • Improve the structure of the report
  • Focus on ensuring spelling and grammar are correct

  18. Recognition ‘ the marking criteria given doesn’t necessarily mean it is exhaustive and contains all factors of what makes an excellent/ideal report... Feedback was based on the marking criteria which was open to interpretation. This doesn’t necessarily mean it is not valid or makes peer assessment difficult, but that it is a subjective process’

  19. Future implementations
  • Develop peer assessment in each year group
  • Utilise ranking more widely for higher-level learning groups
  • Embed the web-based system fully

  20. Conclusions
  • Peer assessment in SEMS is a useful tool
  • Students start to judge quality when asked to give feedback, but may not have the skills to award marks
  • Peer assessment has several complementary functions
  • Reliability of marks is not the most important parameter
  • Students learn from experience
