
UK AIS 2007


Presentation Transcript


  1. UK AIS 2007
  Addressing Student Satisfaction in Undergraduate Computing Programmes - the use of constructive alignment in automating mass, timely and high-quality feedback
  12th April 2007
  Cobham, Jacques, Spilberg

  2. Background Information
  - David Cobham, Head of Department of Computing and Informatics
  - Kevin Jacques, Senior Lecturer in Computing
  - Rose Spilberg, Senior Lecturer in Computing
  - Constructive alignment implementation process from 2000 onwards
  - This is not intended to be a prescription

  3. Structure
  - Constructive alignment: background, process
  - Assessment strategies: paradigm shift, Criterion Reference Grids
  - Improving feedback: generic CRG statements, mass feedback
  - Enhancements
  - Evaluation: students, External Examiners, academic staff
  - Conclusions
  - Questions

  4. Constructive Alignment
  A process to ensure the outcomes of every teaching session are in line with:
  - the learning requirements of the unit/module
  - the learning requirements of the programme/award
  and that the assessment strategies employed match the learning outcomes of the award.
  Typically a three-phase approach:
  - Curriculum alignment
  - Assessment alignment
  - Pedagogic alignment

  5. Alignment as a Process
  [Diagram: curriculum alignment maps existing unit LOs onto new unit LOs, informed by Bloom's taxonomies and the Computing benchmarks]

  6. Process Validation
  Initial process undertaken by Theme; validation was across the Department.
  - Sanity check: unit LOs checked for consistency, typos and duplication
  - Mapping exercise: unit LOs numbered and mapped against programme-level LOs
  - Verb analysis
  Some refinements required; minor modifications to units.

  7. Assessment Strategies: A Paradigm Shift
  - Curriculum alignment based on competency
  - Computing benchmark statements address ‘Threshold’ and ‘Modal’ performance
  - Based on these, we identify an expectation for our students for each assessment at 3rd and 2:2 level
  - We then add 2:1 and 1st-class benchmark statements for each LO
  - We then GRADE, we do not MARK
  - Some LOs need decomposition, so we provide criterion statements to identify what is being assessed
  - Articulated through a Criterion Reference Grid (CRG)

  8. Criterion Reference Grid
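  As an illustration only, a CRG pairs each learning outcome with a benchmark statement at each grade band. A minimal sketch in Python follows; the LO and descriptor wording here are invented for illustration, not taken from the original grid:

      # Hypothetical Criterion Reference Grid fragment: one benchmark
      # statement per (learning outcome, grade band). The wording is
      # invented for illustration; a real CRG uses the unit's own LOs.
      crg = {
          "LO1: design a relational schema": {
              "3rd": "Captures the main entities, with some normalisation errors.",
              "2:2": "Largely normalised schema with minor omissions.",
              "2:1": "Fully normalised schema with justified design decisions.",
              "1st": "Exemplary schema; critically evaluates alternative designs.",
          },
      }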

  9. Procedural Implications
  Assessment calculations
  - Initial approach was based on boundary marks
  - Now at mid-points, with some flexibility built in
  Student briefing and feedback
  - The CRG became the mechanism for both the assessment brief and feedback
  - In early attempts the language of the grid was not seen as positive or individualistic enough to be considered good-quality feedback
  - Now typically enhanced with personalised feedback; this personalisation takes time

  10. Improving Feedback
  Grade-point mechanism
  - To improve feedback to students failing to meet the threshold performance level, ‘fail grade’ points were established
  - To allow credit for outstanding performance, ‘First +’ and ‘First ++’ grade points were also added
  - Resulting in a 10-point grading scale
  - Grading can therefore easily be done ‘mechanistically’
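  A sketch of how such a scale makes grading mechanistic, assuming standard UK classification bands: the fail-grade labels and the exact mid-point marks below are illustrative assumptions, not the authors' values; only the shape of the scale (fail points below threshold, 3rd/2:2/2:1/1st, plus ‘First +’ and ‘First ++’) comes from the slides.

      # Hypothetical 10-point grading scale mapped to percentage marks.
      # Labels and mid-point marks are illustrative assumptions based on
      # standard UK classification bands.
      GRADE_POINTS = {
          "Fail (no attempt)": 0,
          "Fail (low)": 15,
          "Fail (marginal low)": 25,
          "Fail (marginal)": 35,
          "3rd": 45,   # mid-point of the 40-49 band
          "2:2": 55,   # mid-point of the 50-59 band
          "2:1": 65,   # mid-point of the 60-69 band
          "1st": 75,
          "First +": 85,
          "First ++": 95,
      }

      def unit_mark(lo_grades):
          """Average the mid-point marks awarded across a unit's LOs."""
          marks = [GRADE_POINTS[g] for g in lo_grades]
          return sum(marks) / len(marks)

      # e.g. unit_mark(["2:1", "2:2", "1st"]) -> 65.0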

  11. Improving Feedback
  - Many academics grade using spreadsheets and handwrite personal feedback on feedback sheets
  - However, by using spreadsheet lookup functions, a generic feedback statement for each LO can be used to construct a ‘feedback rubric’ that reads like individual comments
  - By creating a feedback rubric for each ‘box’ on the CRG that reflects the performance typified by that box, feedback is enhanced beyond the CRG statements
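  A minimal sketch of the lookup idea, in Python rather than spreadsheet lookup functions such as VLOOKUP; the statements, LO names and grades are invented for illustration:

      # Minimal sketch of the rubric lookup the slides describe with
      # spreadsheet functions. All wording below is invented.
      RUBRIC = {
          ("LO1", "2:1"): "Your schema is well normalised and your design choices are clearly justified.",
          ("LO1", "2:2"): "Your schema is largely sound, though some design choices need justification.",
          ("LO2", "2:1"): "Your queries are efficient and, for the most part, well documented.",
      }

      def feedback_paragraph(name, grades):
          """Join per-LO statements so they read like an individual comment."""
          lines = [RUBRIC[(lo, grade)] for lo, grade in grades.items()]
          return "Dear " + name + ",\n" + " ".join(lines)

      print(feedback_paragraph("Sam", {"LO1": "2:1", "LO2": "2:1"}))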

  12. Mass Reporting
  A single spreadsheet therefore stores all data relating to grade points and feedback comments for a cohort. Using standard Office suite applications, this data generates three feedback documents:
  - an e-mail sent directly to the student's e-mail address
  - a feedback sheet that is printed and secured to the transcript for External Examiner (EE) moderation and subsequent return to the student
  - a file copy for the student file
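  The slides describe a mail-merge-style pipeline built with standard Office applications; a rough Python equivalent is sketched below, with the column layout, addresses and SMTP host all assumed for illustration:

      # Rough Python analogue of the Office-based pipeline the slides
      # describe. Column names, addresses and the SMTP host are
      # assumptions for illustration, not the authors' setup.
      import csv
      import smtplib
      from email.message import EmailMessage

      def send_cohort_feedback(csv_path="cohort.csv", host="smtp.example.ac.uk"):
          with open(csv_path, newline="") as f, smtplib.SMTP(host) as smtp:
              for row in csv.DictReader(f):     # one row per student
                  msg = EmailMessage()
                  msg["From"] = "tutor@example.ac.uk"
                  msg["To"] = row["email"]
                  msg["Subject"] = "Feedback: " + row["unit"]
                  msg.set_content(row["feedback"])
                  smtp.send_message(msg)        # 1. e-mail to the student
                  # 2./3. the printable sheet and file copy share one text file here
                  with open(row["id"] + "_feedback.txt", "w") as out:
                      out.write(row["feedback"])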

  13. Enhancements
  Ranking
  - In-class tests do not have CRGs
  - One tutor used the automated feedback mechanism to report back performance in a level-one in-class test
  - The feedback included notification of cohort ranking as well as the grade achieved
  - Received very positively by the students
  Positivity
  - Addition to the rubric of comments that directly address possible improvement strategies

  14. Enhancements
  Re-use
  - In units where critical and analytical skills are assessed, the grade-point rubric can be reused in subsequent iterations of the unit, even when the assessment changes
  - Case study assessments can be ‘framed’ by the rubric even when different cases are used
  Regulatory issues
  - Statements relating to resit opportunities, final grade approval, etc. are now added to the feedback

  15. Evaluation
  Students
  - Student perception is still that personalised feedback is given
  - Timeliness of feedback: a dramatic improvement
  - Unit evaluation statistics have improved
  External Examiners
  - Quality of feedback has been favourably commented upon
  Academic staff
  - Some still do not like to grade on spreadsheets

  16. Conclusions
  - Constructive alignment is very tough (especially in the early stages)
  - With large cohort sizes making quality feedback a time-consuming task, it can offer scope for automation
  - The quality of our feedback is seen as being ‘as good as, if not better than before’

  17. Contact Details
  - David Cobham, dcobham@lincoln.ac.uk
  - Kevin Jacques, kjacques@lincoln.ac.uk
  - Rose Spilberg, rspilberg@lincoln.ac.uk
  Department of Computing and Informatics, University of Lincoln, Lincoln LN6 7TS
