
Improving student success through the implementation of weekly, student-unique CAA tutorial sheets.


Presentation Transcript


  1. Improving student success through the implementation of weekly, student-unique CAA tutorial sheets. Mark Russell & Peter Bullen, University of Hertfordshire

  2. Why listen to me? • This CAA implementation has made a difference! • Student attendance is up. • Exam performance is up. • Retention of students is likely to be up. • This project is transportable to other areas.

  3. Background. • First-year module in Fluid Mechanics and Thermodynamics. • ~150 students. • 4 teaching staff on the team. • A variety of teaching and learning settings. • Explicit use of the University's MLE (StudyNET).

  4. So why bother? • Poor exam performance. Possible causes include: • The language of the subject is probably new to many students. • There is a need for some mathematical ability. • Attendance at lectures and tutorials is patchy. • The exam performance is particularly concerning given the development of structured learning materials and a real desire to support the students using StudyNET. • We believed we were doing our bit!

  5. The perceived problem. • Students did not always expend the required effort. • Tutorial questions remained unanswered. • Revision time became the primary learning time! • Did the assessment process actively support the learning?

  6. What do we want to do? • Actively encourage (force!) students to engage with all the materials. Why? • It consolidates learning. • Helps apply the new language. • The maths is problem-oriented. • Develops confidence in students' ability. • Forces a structured study pattern.

  7. How did we do it? • Developed an integrated, summative assessment regime. Its key features were: • Weekly. • Student-unique. • Forcing.

  8. The integrated assessment regime included: • A student-unique Weekly Assessed Tutorial Sheet (WATS). • An evolving automated marking sheet. • A manual nudge to non-participants. • A full worked solution (uploaded to StudyNET after student submissions). • A report on the group's weekly performance. • Some generic notes on issues observed.

  9. How did we do it? • ‘Word’ sets the generic question. • ‘Excel’ creates randomised numeric data. • ‘Mail merge’ combines the generic question with the randomised numeric data to create a student-unique WATS. • StudyNET is used to deliver the questions to the students. A code sketch of this pipeline follows.
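To make the pipeline concrete, here is a minimal Python sketch of the same idea: seed a random generator per student, draw numeric parameters, and merge them into a generic question template. The question text, the parameter names and ranges, and the `students.csv` class list are all hypothetical illustrations; the authors' actual tooling used Word, Excel and mail merge.

```python
# Sketch of the WATS generation idea: per-student random numbers merged
# into a generic question template. Parameter names, ranges and the
# question text are illustrative, not the module's actual materials.
import csv
import random

TEMPLATE = (
    "Q1. Water flows through a pipe of diameter {d_mm} mm "
    "at {v_ms} m/s. Calculate the volumetric flow rate."
)

def make_wats(student_id: str, seed_base: int = 2003) -> dict:
    # Seed from the student ID so each sheet is reproducible per student.
    rng = random.Random(f"{seed_base}-{student_id}")
    params = {
        "d_mm": rng.choice(range(20, 105, 5)),    # pipe diameter, mm
        "v_ms": round(rng.uniform(0.5, 3.0), 1),  # flow velocity, m/s
    }
    return {"student": student_id,
            "question": TEMPLATE.format(**params),
            **params}

if __name__ == "__main__":
    # A 'students.csv' file with an 'id' column is assumed here.
    with open("students.csv", newline="") as f:
        for row in csv.DictReader(f):
            sheet = make_wats(row["id"])
            print(sheet["student"], "->", sheet["question"])
```

Seeding from the student ID means the same unique sheet can be regenerated later for marking, which is what makes a mail-merge-style workflow manageable at ~150 students.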

  10. What did it look like? • 11 WATS were developed. • Each WATS had parent and child questions. • Each WATS had to be completed within one week. • The students had to submit a hard copy of their results sheet. • A supplementary Excel marking sheet was developed to help the MANUAL marking! • Marking and analysis of the group's performance was prompt: within 1 day of submission.
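A marking sheet of this kind plausibly recomputes each student's expected answer from their unique parameters and accepts submissions within a tolerance. The sketch below continues the hypothetical flow-rate question from the previous block; the 2% relative tolerance and the formula are assumptions, not the authors' actual marking rules.

```python
# Illustration of the marking-sheet idea: recompute a student's expected
# answer from their unique parameters, then compare the submitted value
# within a relative tolerance. Tolerance and formula are assumptions.
import math

def expected_flow_rate(d_mm: float, v_ms: float) -> float:
    # Q = A * v, with A = pi * d^2 / 4 (d converted from mm to m).
    area = math.pi * (d_mm / 1000.0) ** 2 / 4.0
    return area * v_ms  # m^3/s

def mark(submitted: float, expected: float, rel_tol: float = 0.02) -> bool:
    # Accept answers within a relative tolerance of the expected value.
    return math.isclose(submitted, expected, rel_tol=rel_tol)

# Example: a student with d = 50 mm, v = 2.0 m/s submits 0.0039 m^3/s.
exp = expected_flow_rate(50, 2.0)  # ~0.003927 m^3/s
print(mark(0.0039, exp))           # True: within 2%
```

Because every student has different numbers, one rule marks every sheet consistently, which is what makes next-day marking and analysis feasible.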

  11. As time went on … • WATS 9–11 inclusive had an automated student submission sheet. • Unfortunately, the timing of the development and server security issues did not allow widespread deployment; we had to settle for loading it on one PC only. • Students were issued passwords and were allowed only one submission per WATS. A plea to their good nature not to go looking for things to delete also helped!
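A minimal sketch of the one-submission-per-WATS rule, assuming a simple on-disk log keyed by WATS number and student ID. The original ran as a protected sheet on a single PC with issued passwords; the JSON log here is purely illustrative.

```python
# Sketch of the one-submission-per-WATS rule: a second attempt by the
# same student for the same WATS is rejected. Storage format is assumed.
import json
import os

LOG = "submissions.json"

def already_submitted(wats_no: int, student_id: str) -> bool:
    if not os.path.exists(LOG):
        return False
    with open(LOG) as f:
        return f"{wats_no}:{student_id}" in json.load(f)

def record_submission(wats_no: int, student_id: str, answers: dict) -> bool:
    # Returns False (and stores nothing) on a second attempt.
    if already_submitted(wats_no, student_id):
        return False
    seen = []
    if os.path.exists(LOG):
        with open(LOG) as f:
            seen = json.load(f)
    seen.append(f"{wats_no}:{student_id}")
    with open(LOG, "w") as f:
        json.dump(seen, f)
    # ... the answers dict would be appended to the marking sheet here ...
    return True

print(record_submission(9, "s101", {"q1": 0.0039}))  # True: first attempt
print(record_submission(9, "s101", {"q1": 0.0041}))  # False: duplicate
```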

  12. Why did we choose this approach? • It does not suffer the more obvious potential drawbacks of an MCQ CAA approach, i.e.: • Answers for Q1 are not all the same! • No concerns over fairness of the test/equality of questions in a question bank. • It does not tie students to a PC. • Does not inadvertently create bias. • Does not inadvertently give hints. • A ‘chance’ answer is not an issue.

  13. Key benefits. • Stops solution sharing at source. • Students could still help each other. • Students get the feedback they so often like. • Forces a structured study pattern. • ‘Not cool to be studious’ is not an issue with this approach. • Attempts to engage with everybody. • Allows students to see where they went wrong.

  14. Critical success factors. • Attendance at lectures and tutorials was improved. • More tutorial questions were tackled by the students. • But what about exam performance?

  15. WATS & exam performance.

  16. Overlaying the WATS & exam data

  17. 2002 vs. 2003 exam performance.

  18. 2002 vs. 2003 exam performance.

  19. Can WATS predict exam grade? 52% of the students had a difference of 15% or less between their WATS grade and their exam grade.

  20. Correlating the exam to the WATS.
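One way the slide-19 and slide-20 statistics could be computed: Pearson correlation between each student's WATS average and exam mark, plus the fraction whose two scores differ by 15 points or less (reading the slide's "15%" as percentage points). The scores below are made-up placeholders, not the 2003 cohort data; `statistics.correlation` requires Python 3.10+.

```python
# Sketch of the slide-19/20 analysis: Pearson r between WATS and exam
# marks, and the share of students within 15 points. Data is invented.
from statistics import correlation  # Python 3.10+

wats = [62, 48, 75, 55, 80, 40, 67]  # hypothetical WATS averages (%)
exam = [58, 50, 70, 49, 77, 62, 45]  # hypothetical exam marks (%)

r = correlation(wats, exam)
within_15 = sum(abs(w - e) <= 15 for w, e in zip(wats, exam)) / len(wats)

print(f"Pearson r = {r:.2f}")
print(f"Within 15 points: {within_15:.0%}")
```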

  21. What did the students think (1/7)? +1.1

  22. What did the students think (2/7)? +0.9

  23. What did the students think (3/7)? +1.42

  24. What did the students think (4/7)? +0.15

  25. What did the students think (5/7)? -0.09

  26. What did the students think (6/7)? -0.61

  27. What did the students think (7/7)? +0.8

  28. Where next? • Build on the lessons learned, with greater emphasis on automating more of the processes. • There now exists a one-stop C++ program for entering the WATS submissions.

  29. More where next? • More analysis of results. • Incorporate automated nudges (a sketch follows below). • This provides additional student care and individualised contact. • Consider adopting a competence pass threshold. • This may help close the learning cycle. • Provide student-unique additional study material. • Match material to individual weaknesses.
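A sketch of what an automated nudge could look like: diff the class list against the submission log and draft a reminder for each non-submitter. The message wording and student IDs are invented; the authors' nudges were manual at this stage, and actual delivery (e.g. via StudyNET) is left out.

```python
# Sketch of an automated nudge: find students who have not submitted a
# given WATS and draft a reminder. All IDs and wording are placeholders.
def non_submitters(class_list: set[str], submitted: set[str]) -> set[str]:
    # Everyone on the class list who is absent from the submission log.
    return class_list - submitted

def draft_nudge(student_id: str, wats_no: int) -> str:
    return (f"Dear {student_id}, WATS {wats_no} has not been received. "
            "Please submit before the deadline; the full worked solution "
            "is posted on StudyNET after submissions close.")

everyone = {"s101", "s102", "s103"}
handed_in = {"s101"}
for sid in sorted(non_submitters(everyone, handed_in)):
    print(draft_nudge(sid, 5))
```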

  30. Why automate? • Reduces staff time. • Marking rules can be set up once and applied consistently to all students, ensuring fairness. • Will help with the move towards a competence pass structure. • Allows implementation without becoming too time-consuming. The approach is already likely to be borrowed by an electrical science module.

  31. Conclusions. • The WATS has improved exam performance. • The WATS has improved attendance. • The WATS will help with student retention. • This WATS approach would not have been feasible without the use of computers.

  32. Conclusions. • The students liked the whole experience. • We will be looking to export this approach to other modules. • There are still some outstanding issues to investigate. • The application of CAA to this module has been a remarkable success.

  33. Acknowledgements. • The authors wish to thank LTSN Engineering for their support of this work.
