
Centre for Open Learning of Mathematics, Science Computing and Technology (COLMSCT)




  1. Using e-assessment to support distance learners of science. Sally Jordan and Philip Butcher, GIREP-EPEC-PHEC 2009. Centre for Open Learning of Mathematics, Science Computing and Technology (COLMSCT)

  2. My plan • Two presentations describing work in e-assessment • The context – the UK Open University • Why e-assessment? • Some examples of our use of e-assessment • Evaluation • And then a more specific example of our work…

  3. The UK Open University • Founded in 1969; • Provides supported distance learning; • 150,000 students, mostly studying part-time; • Undergraduate courses are completely open entry, so students have a wide range of previous qualifications; • Normal age range from 18 to ??; • 10,000 of our students have declared a disability of some sort; • 25,000 of our students live outside the UK.

  4. Implications for assessment • Within the Open University context, learners are geographically separated and we cannot assume that they will meet their tutor in order to receive feedback. • We are seeking to provide students with feedback on e-assessment tasks which is personalised and received in time to be used in future learning. • We are using small, regular e-assessment tasks to help students to pace their study. • We are also using e-assessment tasks to encourage students to reflect on their learning and to enter into informed discussion with their tutor. • We are using e-assessment tasks in a range of ways, e.g. summative, formative-only and diagnostic.

  5. The OpenMark system • Uses a range of question types, going far beyond what is possible with multiple choice; • Question types include: numerical input, text input, drag and drop, hotspot; • Students are allowed three attempts with an increasing amount of teaching guidance, wherever possible tailored to the student’s previous incorrect answer; • Different students receive variants of each question so each has a unique assignment. • OpenMark has been incorporated into Moodle, the open source virtual learning environment being used by the Open University.
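The OpenMark behaviour described above (per-student question variants, plus three attempts with escalating guidance) can be sketched in miniature. This is an illustrative sketch only, not the real OpenMark API: the function names, the kilometre-conversion question, and the feedback wording are all invented for the example.

```python
import random

# Hypothetical sketch of OpenMark-style question behaviour.
# Assumptions (not from OpenMark itself): a seeded RNG produces each
# student's variant, and canned feedback escalates over three attempts.

def make_variant(seed):
    """Generate a per-student variant of a 'convert km to m' question."""
    rng = random.Random(seed)          # same seed -> same variant
    km = rng.randint(2, 9)
    return {"prompt": f"Express {km} km in metres.", "answer": km * 1000}

# Guidance grows more explicit with each incorrect attempt.
FEEDBACK = [
    "Not quite. Check your unit conversion and try again.",
    "Hint: 1 km = 1000 m, so multiply the number of kilometres by 1000.",
    "The correct answer is shown below; please review the worked solution.",
]

def grade(variant, responses):
    """Mark up to three attempts; return (correct, attempts_used, feedback_given)."""
    feedback = []
    for attempt, response in enumerate(responses[:3]):
        if response == variant["answer"]:
            return True, attempt + 1, feedback
        feedback.append(FEEDBACK[attempt])
    return False, min(len(responses), 3), feedback
```

In OpenMark proper the second-attempt guidance can also be tailored to the particular wrong answer given (e.g. spotting a factor-of-ten slip), which this fixed-message sketch does not attempt.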

  6. Embedding iCMAs (interactive computer-marked assignments) – an example • The assessment strategy for S104: Exploring Science includes 8 TMAs (tutor-marked assignments), 9 iCMAs and a written End of Course Assignment. • Home experiments, DVD activities, web-based activities and contributions to online tutor group forums are assessed, as is reflection on learning and on previously provided feedback. • Integration is key: are we talking about assessment or learning? • iCMAs are credit-bearing (summative) but low stakes. Is this the best approach?

  7. Evaluation methodologies for iCMAs • Ask students • Observe students • Analyse data • Ask students • Observe students

  8. The current project… • Is at the data analysis stage • Looking at the use students make of iCMAs in a wide range of different settings: Range: S103 (e-assessment was an added extra) to S151 (the ECA is an iCMA); Summative: S104, S154, SDK125 (iCMAs embedded within the assessment strategy; low-stakes summative); Formative: S279, S342, practice iCMAs for S154, SDK125, S151; Maths Skills Questions; Thresholded: new Physics and Astronomy courses; SM358 is formative-only, but with a ‘carrot’; Diagnostic: ‘Are you ready for?’ quizzes.
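The "how many students attempt each question" statistic behind the next two slides can be computed from attempt logs. This is a minimal sketch under assumed data shapes (a flat list of student/question attempt records), not the project's actual analysis code.

```python
from collections import defaultdict

# Illustrative sketch: given attempt records as (student_id, question_id)
# pairs, count how many distinct students attempted each question.
# Repeat attempts by the same student count once.

def students_per_question(attempt_log):
    seen = defaultdict(set)
    for student, question in attempt_log:
        seen[question].add(student)
    return {question: len(students) for question, students in seen.items()}
```

For example, `students_per_question([("s1", "q1"), ("s1", "q1"), ("s2", "q1"), ("s2", "q2")])` returns `{"q1": 2, "q2": 1}`: the same counting approach works whether the iCMA is summative or formative, which is what makes the comparison in the following slides possible.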

  9. How many students attempt each question? (summative)

  10. How many students attempt each question? (formative)

  11. Use of feedback…formative-only use

  12. Use of feedback…summative use

  13. Conclusions • Students appear to engage with summative iCMA questions at a deeper level than when they are in formative-only use; • However, there are issues, especially a preoccupation with the minutiae of grading; • Does the answer lie in thresholding, or in formative-only use with a ‘carrot’? • Some of the findings are unexpected; we are currently investigating the reasons for these.

  14. Some more surprises… • S104 students are more likely than others to look at all the questions before attempting any. Why? • There have been some surprises in looking at actual student responses (more later…); • It appears that student perception of what they are ‘meant’ to do is a very strong driver; • And other course components have a part to play…

  15. The impact of an end of course examination on iCMA use

  16. Acknowledgments • Funding from COLMSCT and piCETL; • The assistance of many people associated with COLMSCT and piCETL, especially Spencer Harben and Richard Jordan; • The co-operation of the course teams involved in the investigation.

  17. Sally Jordan, OpenCETL, The Open University, Walton Hall, Milton Keynes, MK7 6AA. s.e.jordan@open.ac.uk http://www.open.ac.uk/colmsct/projects/icma http://www.open.ac.uk/picetl/ http://www.open.ac.uk/openmarkexamples/
