
ICT in Assessment and Learning


Presentation Transcript


1. ICT in Assessment and Learning
Developments from the Enigma Project
Robert Harding, Nick Raikes
ITAL Unit - Interactive Technologies in Assessment and Learning
SCROLLA Symposium, 6 Feb 2002

2. Introduction

• ICT-led innovation:
  • Exciting image…
  • …but assessment is Cinderella!
• Holistic nature of ‘The Learning System’:
  • Assessment is a goal for teachers and learners…
  • …but valid assessment must be rooted in learning
• How will new methods fit with existing practice?
• The “three-legged race”

3. Outline

• The Enigma Project trials:
  • Scope
  • User interface
  • Evaluation:
    • ‘Traditional’ style questions (conceptual)
    • ‘New interactive’ style questions (analytical)
  • Administrative issues
• Developments triggered as a result:
  • Brief résumé of some resulting work
  • Closer look at one aspect: on-screen marking
• Conclusion

4. The Enigma Project - Scope

• Two years, 3+2 schools, about 170+120 pupils
• Pilot 1: O-Level Physics examination in paper and CBT forms
  • A few graphics manipulation items
• Pilot 2: selection of O-Level Science in CBT form
  • Conceptual ‘objective’ items
  • Analytical simulation + free-text answer items
• Marking:
  • Objective/multiple choice: automatic
  • Open answers: printed out, marked by hand
• Comparative evaluation of pupils’ performance

5. User Interface

• Navigation
• Time count-down
• Progress indicator
• Candidate bookmark
• Audio feature for questions
(A sketch of this candidate-facing state follows below.)
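The interface features above boil down to a small amount of candidate-facing state. The sketch below is a minimal, hypothetical model of that state; the class and field names are illustrative, not taken from the Enigma software.

```python
import time

class ExamSession:
    """Hypothetical candidate-facing state for a CBT session."""

    def __init__(self, question_ids, duration_seconds):
        self.question_ids = list(question_ids)
        self.end_time = time.monotonic() + duration_seconds
        self.answers = {}       # question_id -> response
        self.bookmarks = set()  # questions flagged by the candidate
        self.current = 0        # index used for navigation

    def seconds_remaining(self):
        """Drives the on-screen time count-down."""
        return max(0, int(self.end_time - time.monotonic()))

    def progress(self):
        """Drives the progress indicator: fraction of questions answered."""
        return len(self.answers) / len(self.question_ids)

    def toggle_bookmark(self):
        """Candidate bookmark on the current question."""
        qid = self.question_ids[self.current]
        self.bookmarks ^= {qid}  # add if absent, remove if present

    def go_to(self, index):
        """Navigation between questions."""
        if 0 <= index < len(self.question_ids):
            self.current = index
```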


7. Evaluation - ‘traditional’

• Pilot 1 - the evaluators concluded:
  • For MCQs, no obvious differences between groups (3%)
  • For open-ended theory questions, those on paper did better than those on CBT (11%)
• Pilot 2 - poor correlation between pilot and real examinations:
  • Was computer aptitude responsible?
  • Was it a mismatch between the trial and the real examination?

8. Evaluation - ‘analytical’

• Objective: how do students react to ICT-oriented question styles?
• Use of simulations
• 6 questions: 2 each in Physics, Chemistry and Biology
• Panel of examiners
• A typical example follows; note:
  • access to the simulation, free-text response boxes
  • ability to record student actions (see the logging sketch below)
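Recording student actions amounts to appending time-stamped events as the candidate manipulates the simulation. Below is a minimal sketch under that assumption; the names and JSON format are illustrative, not the Enigma implementation.

```python
import json
import time

class ActionLog:
    """Time-stamped record of what a candidate did in a simulation."""

    def __init__(self, candidate_id, question_id):
        self.candidate_id = candidate_id
        self.question_id = question_id
        self.events = []

    def record(self, action, **details):
        self.events.append({
            "t": time.time(),   # wall-clock timestamp
            "action": action,   # e.g. "set_voltage", "close_switch"
            "details": details, # action-specific parameters
        })

    def dump(self, path):
        with open(path, "w") as f:
            json.dump({
                "candidate": self.candidate_id,
                "question": self.question_id,
                "events": self.events,
            }, f, indent=2)

# Example: a candidate working a hypothetical circuit simulation.
log = ActionLog("cand-042", "physics-q1")
log.record("set_voltage", value=6.0)
log.record("close_switch")
log.dump("cand-042_physics-q1.json")
```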

9. Evaluation - ‘analytical’

• Student responses:
  • Only 1 in 6 failed to understand what was required of them…
  • …but 40% reported ‘occasions when unsure what to do’ (though how true that is of most of us in many situations!)
• Poor correlation between this test and the ‘real’ examination
• BUT… 70% said “computer based tests are as fair as conventional tests”
• Computer literacy was the most commonly cited reason for bias

10. Evaluation - overall

• 40% of students said the computer slowed them down:
  • Interpretation: they are unfamiliar with using computers in schoolwork
  • Conclusion: learning and assessment must be integrated
• 40% said the computer stopped them answering as they wanted:
  • Most common complaint: not enough space to write
  • Disliked not being allowed to go back and change an answer
• 70% thought CBTs as fair as paper-based tests:
  • Reasons given for unfairness: typing speed and (lack of) computer literacy

11. Administrative issues

• Schools’ normal IT systems can interact with assessment software
• Individual systems crash - incremental backup is essential (see the autosave sketch below)
• A high level of expert attention was needed
• Is printing needed?
• Security - visibility of screens:
  • At least one candidate was observed using email!
  • Issues of access to other software and data
• Interference with normal working must be minimised
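Incremental backup here means persisting each answer as soon as it changes, so a crashed workstation loses at most the answer being edited. A minimal sketch, assuming an append-only log that is replayed on restart (all names hypothetical):

```python
import json

class IncrementalBackup:
    """Append-only answer log; replay it to recover after a crash."""

    def __init__(self, path):
        self.path = path

    def save(self, question_id, answer):
        # One JSON record per line, flushed immediately so that a
        # crash loses at most the record being written.
        with open(self.path, "a") as f:
            f.write(json.dumps({"q": question_id, "a": answer}) + "\n")
            f.flush()

    def recover(self):
        """Rebuild the latest answer per question from the log."""
        answers = {}
        try:
            with open(self.path) as f:
                for line in f:
                    record = json.loads(line)
                    answers[record["q"]] = record["a"]  # later entries win
        except FileNotFoundError:
            pass  # no backup yet: nothing to recover
        return answers
```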

12. What developed? - Analysis

• We have not seen a surge of CBT use in schools:
  • Material circumstances and resources
  • Holistic nature of the changes needed
  • Cast of influences
• Can you ‘box off’ summative assessment?
  • Traditional links between assessment and teaching
  • Examiner-teachers set questions, not Boards
• Feedback loops:
  • The learning process is geared to passing examinations
  • Examinations are rooted in the ‘educational ambience’

13. What we did - ‘ambience’

• Teacher support for using ICT - TEEM: http://www.teem.org.uk/
• Syllabus support
• e-Lists and electronic communities
• Electronic learning resources, e.g. Enigma simulations on the web: http://www.cie.org.uk/learning/science_practicals/

14. What we did - assessment

• CALM / CUE / EQL collaboration
• MEI A-Level Maths examinations
• IMS standards and the ‘QTI’ workgroup:
  • Question and Test Interoperability (see the item sketch below)
• Standards for examination conduct
• On-screen marking
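Question and Test Interoperability (QTI) is an IMS XML binding that lets items and tests move between authoring tools, item banks and delivery systems. As a rough illustration only, the sketch below assembles a multiple-choice item in the general shape of early QTI 1.x; the element names are recalled from the specification’s style and are not guaranteed to validate against the real schema.

```python
import xml.etree.ElementTree as ET

# Assemble a minimal multiple-choice item, loosely following the
# structure of early IMS QTI 1.x (illustrative, not schema-validated).
root = ET.Element("questestinterop")
item = ET.SubElement(root, "item", ident="enigma-demo-1",
                     title="Demo multiple choice")

presentation = ET.SubElement(item, "presentation")
material = ET.SubElement(presentation, "material")
ET.SubElement(material, "mattext").text = "Which gas relights a glowing splint?"

# The response block lists the choices shown to the candidate.
response = ET.SubElement(presentation, "response_lid", ident="RESP")
choices = ET.SubElement(response, "render_choice")
for ident, text in [("A", "Oxygen"), ("B", "Hydrogen"), ("C", "Nitrogen")]:
    label = ET.SubElement(choices, "response_label", ident=ident)
    mat = ET.SubElement(label, "material")
    ET.SubElement(mat, "mattext").text = text

print(ET.tostring(root, encoding="unicode"))
```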

15. On-screen marking

• Scan scripts, or capture electronic scripts
• What do we want it for?
  • Faster, more flexible marking and management
  • More efficient quality control
  • Better feedback to Centres
  • Transition to on-line assessment

16. Study - ‘proof of concept’

• Actual, live scripts:
  • O-Level: Maths (350)
  • A-Level: Geography (900), English Literature (300)
• 5 examiners per subject
• Marked both conventionally and on screen
• Whole script, marking by question
• Scripts downloaded via the Internet

17. Features of the software

• Ticks and crosses
• Anchored comments (see the data-model sketch below)
• Display control, e.g. zoom and script scroll
• View marking guide
• Navigation between questions or scripts
• Progress monitoring and referral
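An anchored comment is simply an annotation tied to a position on the script image, so it stays in place as the examiner zooms and scrolls. A minimal sketch of such a data model, with hypothetical names rather than those of the trial software:

```python
from dataclasses import dataclass
from enum import Enum

class MarkKind(Enum):
    TICK = "tick"
    CROSS = "cross"
    COMMENT = "comment"

@dataclass
class Annotation:
    """A mark anchored to a point on one page of a script.

    Coordinates are stored as fractions of the page size, so the
    anchor survives zooming and scrolling of the displayed image.
    """
    page: int
    x: float        # 0.0 .. 1.0, fraction of page width
    y: float        # 0.0 .. 1.0, fraction of page height
    kind: MarkKind
    text: str = ""  # used when kind is COMMENT

# Example: a tick in the margin of page 2, plus an anchored comment.
marks = [
    Annotation(page=2, x=0.85, y=0.10, kind=MarkKind.TICK),
    Annotation(page=2, x=0.50, y=0.40, kind=MarkKind.COMMENT,
               text="Method correct, arithmetic slip"),
]
```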

18. Examiners’ impressions

• Generally managed the downloading
• Scripts at least as legible as on paper
• Most felt they gave the same marks:
  • Exceptions in English Literature and Geography
• Open questions about the trial’s nature, annotation and whole-script judgement
• Maths raised points about marking by question
• All would mark on-screen again

19. Analysis of marks

• Maths component - consistency
• Geography - mostly satisfactory:
  • one examiner more severe on screen
  • one consistent on paper but not on screen
• English:
  • 2 more severe on screen
  • All more consistent on paper than on screen
• Possible link with ‘whole script’ judgement? (see the comparison sketch below)
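Severity and consistency of this kind can be checked by comparing each examiner’s marks across the two modes: the mean screen-minus-paper difference indicates a severity shift, and correlation with reference marks indicates consistency. The sketch below uses made-up numbers purely for illustration; it is not the trial data.

```python
from statistics import mean, correlation  # correlation needs Python 3.10+

# Invented marks for ten scripts: one examiner on paper and on
# screen, plus agreed reference marks for the same scripts.
paper     = [12, 15, 9, 18, 11, 14, 16, 10, 13, 17]
screen    = [11, 14, 8, 17, 10, 13, 15, 9, 12, 16]
reference = [12, 15, 10, 18, 11, 14, 16, 10, 13, 17]

# Severity shift: negative means the examiner is more severe on screen.
shift = mean(s - p for s, p in zip(screen, paper))

# Consistency: agreement with the reference marks in each mode.
r_paper = correlation(paper, reference)
r_screen = correlation(screen, reference)

print(f"mean screen-paper difference: {shift:+.2f}")
print(f"r (paper vs reference):  {r_paper:.3f}")
print(f"r (screen vs reference): {r_screen:.3f}")
```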

20. Conclusions

• Holistic nature of the system
• ‘Three-legged race’ - or more?
• Central role of teachers
• Challenge: integrate, make the technology invisible
• Way forward - ‘open source’?

21. Some URLs and email

• ITAL Unit: http://ital.ucles-red.cam.ac.uk/
• Teacher support for using ICT - TEEM: http://www.teem.org.uk/
• Enigma simulations on the web: http://www.cie.org.uk/learning/science_practicals/
• Robert Harding <R.D.Harding@ucles-red.cam.ac.uk>
• Nick Raikes <n.raikes@ucles-red.cam.ac.uk>
