Presentation Transcript


  1. Integrating on-line assessment with class-based learning: a preliminary study of the AiM marking system Richard Walker & Gustav Delius, 6th July 2004

  2. Presentation Aims 1. Assessment and mathematics 2. Computer algebra based assessment systems (AiM) 3. Rationale for AiM – the York innovation 4. AiM implementation 5. Student and staff feedback (2003-04) 6. Challenges and future developments

  3. Assessment and mathematics (1) The majority of tasks may be classified as either: Lower order activities 1. Factual recall 2. Carry out a routine calculation / algorithm 3. Classify some mathematical object 4. Interpret a situation or answer Higher order activities 5. Prove, show, justify (general argument) 6. Extend a concept 7. Criticise a fallacy 8. Create an example (Sangwin, 2003)

  4. Assessment and mathematics (2) For higher order activities: • often no single correct method • no unique correct answer • solutions routine but time-consuming to mark Opportunity: in some circumstances, marking can be performed by computer algebra systems

  5. Computer algebra based assessment systems • Advantages: - can handle questions with no unique answer (by identifying algebraic equivalence; see the sketch below) - questions can be arbitrarily randomised - can ask students to supply examples - can give arbitrarily detailed feedback - allows detailed analysis of student attempts • Disadvantages: - time-consuming to set up well - marking routines can have bugs
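
How equivalence-based marking works, as a minimal sketch: AiM does this checking in Maple, so the Python / SymPy version below illustrates the idea rather than AiM's actual code, and the names (mark_answer, model) are invented for the example.

    import sympy as sp

    x = sp.symbols('x')

    def mark_answer(student_input: str, model_answer) -> bool:
        """Accept any expression algebraically equivalent to the model answer."""
        try:
            student = sp.sympify(student_input)
        except sp.SympifyError:
            return False  # unparseable input is marked wrong
        # simplify(a - b) == 0 tests algebraic equivalence, not string equality
        return sp.simplify(student - model_answer) == 0

    model = (x + 1)**2
    print(mark_answer("x**2 + 2*x + 1", model))  # True: an equivalent form is accepted
    print(mark_answer("x**2 + 2*x", model))      # False: not equivalent

Because equivalence rather than string equality is tested, a student who expands, factors, or reorders the model answer still receives the marks.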

  6. Alice Interactive Mathematics • Web-based assessment system • Uses Maple (computer algebra) – checking for equivalence of answer / solution • System is free / open source • Working system at many universities (Birmingham 2000; York 2003)

  7. Rationale for AiM • Reduce the amount of routine coursework marking – redeploy GTA markers for seminar teaching • Reduce waiting time between coursework submission and marking / feedback • Get students to practise – focus on accuracy and reflect on solutions (opportunity to resubmit) • Give students challenges – exemplifying concepts • Encourage collaboration without copying (randomised questions; a sketch of such randomisation follows)
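
A hedged sketch of the kind of per-student randomisation the slide describes: each student gets a structurally identical question with different coefficients, so peers can discuss the method without copying answers. The seeding scheme and names here are assumptions for illustration; AiM's own randomisation runs in Maple.

    import random
    import sympy as sp

    x = sp.symbols('x')

    def make_question(student_id: str):
        rng = random.Random(student_id)      # deterministic: same student, same question
        a, b = rng.randint(2, 9), rng.randint(1, 9)
        integrand = a * x + b
        answer = sp.integrate(integrand, x)  # model answer computed by the CAS
        return f"Integrate {integrand} with respect to x", answer

    for sid in ["student_001", "student_002"]:
        question, answer = make_question(sid)
        print(sid, "->", question, "| model answer:", answer)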

  8. The York innovation • Integrate AiM questions with traditional homework questions (40% over a range of courses: Calculus, Matrices, etc.) • Students continue to receive (randomised) problem sheets to work on at home • Marks for all assigned problems are collated and displayed on Moodle

  9. AiM implementation • 1st year students (n=182) • Introductory session on AiM (October 2003) • Range of modules over two terms (2003-04): Calculus, Maple, Matrices • Accounting for 40% of coursework, but not the final assessment • 10% penalty per wrong answer (a worked sketch of this scheme follows)
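
A worked sketch of the 10% penalty, assuming the natural reading of the slide (a linear deduction of 10% of the question's marks per incorrect submission, floored at zero); the exact AiM rule is not spelled out here.

    def question_mark(full_marks: float, wrong_attempts: int) -> float:
        """Deduct 10% of the full marks for each incorrect submission."""
        return max(0.0, full_marks * (1 - 0.10 * wrong_attempts))

    # A 10-mark question answered correctly on the third attempt:
    print(question_mark(10, 2))  # 8.0 marks remain after two wrong answers

Under this reading a student can still salvage most of the marks by persisting, which fits the rationale of encouraging resubmission.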

  10. Student expectations: results of October 2003 survey • Response rate (RR) 74% (134/182); A = agree, D = disagree • computerised marking will have a positive impact on maths education (A 54%; D 12%) • will be motivated to try again if an answer is wrong (A 69%; D 7%) • immediate feedback will encourage peer discussion of solutions (A 55%; D 8%) • feedback will help better preparation for seminars / lectures (A 72%; D 6%)

  11. Student experience: results of March 2004 survey (1) • RR 54% (98/182) • computerised marking is relevant to maths education (A 62%; D 8%) • complemented traditional class-based teaching methods (A 67%; D 6%) • class attendance felt to be less important (A 13%; D 74%) • feedback encouraged students to reattempt questions (learn from mistakes) (A 86%; D 4%) • feedback encouraged reflection on solutions (where I went wrong) (A 69%; D 11%) • feedback encouraged peer-based discussion of solutions / study methods (A 52%; D 23%) • marking frustrated students: it highlighted errors in work, but not the reasons for mistakes (A 57%; D 16%)

  12. Student experience: results of March 2004 survey (2) • Convenience, ease of use and immediacy of feedback: “I can study in my own time. Immediate feedback on my performance is very useful.” “AiM is extremely easy to access and the quick response to questions makes it very quick to know whether an answer is right or wrong.” • Peer-based collaboration: “Randomisation of problems makes it possible to work with peers to find the way through a problem, then complete it on your own. All in all a fantastic system with an intuitive and efficient front end.” • Frustration with system glitches: “Very good to have immediate feedback. Not good when there are faults in the system and points are deducted for giving correct answers. This throws doubts on the reliability of the marking.”

  13. Student experience: results of March 2004 survey (3) • Method or solution? “What is frustrating is that there are no marks for method, which is especially annoying when the calculation involves a lot of algebra.” “You get penalised for absentmindedness, where if it was marked on paper the marker would see it was a trivial error.” • More guidance: “...if an attempt is incorrect, the absence of guidance as to what is wrong can be frustrating. It’s impossible to know whether the answer is close or completely wrong.” “A hint button might be nice, available when you have made so many failed attempts. This would mean students who can’t do a question can learn how to do it before a deadline, encouraging them to work more.” (A minimal sketch of this proposed hint rule follows.)
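
The hint button students asked for is not an AiM feature; as a hypothetical sketch, the unlock rule they describe amounts to a simple threshold check (the function name and threshold value are invented for illustration):

    def hint_available(failed_attempts: int, threshold: int = 3) -> bool:
        """Unlock the hint once a student has made `threshold` failed attempts."""
        return failed_attempts >= threshold

    print(hint_available(1))  # False: keep trying unaided
    print(hint_available(3))  # True: hint unlocked before the deadline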

  14. Staff observations on AiM (1) • too early to judge impact on student learning • teething problems: crafting of questions, anticipating student entries • no evidence to suggest a positive effect on class participation • but students will catch their own errors – be accurate and more secure: “Even 3rd yr students when they leave are very capable technically but are not so capable of knowing if they have done something correctly. They need reassurance. AiM might help us in this respect. This problem has vexed us for as long as I can remember. There is a tendency among students to want more and more information – spoon-feeding. The weaning process gets harder and harder.”

  15. Staff observations on AiM (2) • evidence of a shift in student interaction patterns • peer-based problem solving – posting problems / solutions via the forum • increased interaction with lecturers on homework: email rather than office hours (a risk when scaling up) • some student dissatisfaction (particularly among weaker students): “should be getting more of the marks for knowing what to do, rather than how to do it accurately” • and frustration: answers marked wrong after mistyping a formula / syntax • danger of over-dependence on the system / laziness: “Students are encouraged only to make a half-decent try, punching in an answer and getting feedback. They should be thinking before they submit an answer.”

  16. Aim for AiM • The style of questions so far emphasises accuracy rather than self-learning: “matrix manipulation is part of the language, but not the poetry of maths” • Development of the system / feedback to point out conceptual errors • Challenge – to entice thinking, not training: “there is a risk that students will become technically competent, but not innovative and creative” “maths teaching is not in the business of drill, but is all about exemplifying concepts, giving students challenges as well as opportunities to practise”

  17. The Future • Computer Algebra Based Learning and Evaluation System (Naismith & Sangwin, 2004) – open-source infrastructure for marking mathematical learning objects • JISC project collaboration: authoring tools for the creation of assessment equations, taking account of user preferences and accessibility – partners: Sheffield, Birmingham, Durham, Edinburgh, Imperial (London)

  18. References • http://aiminfonet.net • York and AiM (ALTC): http://maths.york.ac.uk/moodle/yorkmoodle/course/
