
E-assessment: a risk-based approach to success


Presentation Transcript


  1. E-assessment: a risk-based approach to success Dr Chris Ricketts, Sub-Dean (Teaching Enhancement), Faculty of Technology, University of Plymouth and Director of Assessment, Peninsula College of Medicine and Dentistry

  2. Overview • history • institutional strategy • risk analysis • examples of success • discussion

  3. History (1) • over 20,000 students • 6 faculties • 6 sites over 150 miles • some use of CAA on local servers

  4. History (2) • Needs analysis in April 1999 • 12 staff, 5 faculties, over 3,000 students • Strategic introduction of CAA (University-wide)

  5. Strategy (1) Zakrzewski’s (1999) ‘Catherine-wheel’ model • Module > Department > Faculty > University We chose University-wide availability, but: • piloted in a controlled set of modules • undertook a risk analysis • then rolled out across the University

  6. Strategy (2) What to pilot? • formative assessment • summative (in-course) assessment • end of year examinations • a variety of subject areas

  7. Strategy (3) • Decision 1: A steering group which involved staff from all affected parts of the University • Decision 2: Academic Board support • Decision 3: Staff training and support • Decision 4: Easy introduction to students • Decision 5: Get student feedback - and act on it

  8. Strategy (4) Decision 6: the big one! • Don’t treat computer aided assessments differently • management • quality processes • security

  9. General experience • a bed of roses? …not quite • but we did prepare carefully and tried to consider all the risks

  10. Risk analysis (Ricketts & Zakrzewski, AEHE 2005) Types of risk • Pedagogic (P) • Operational (O) • Technical (T) • Web-based (W) • Financial (F)

  11. Risk analysis • Define the risks • Estimate likelihood • Estimate severity (who is affected?) • Concentrate on the severe problems • How to avoid • What if it happens?

  12. Defining the risks • The literature can help, but… these need to be YOUR risks, not someone else’s • Estimate likelihood • Estimate severity (who is affected?) • Concentrate on the real problems • How to avoid • What if it happens?

  13. Risk example 1 • P1: Assessment method not integrated into the curriculum • Likelihood? M • Who affected? Students, Academic staff • How much? Module • When? Before assessment This was not a high-severity risk for us.

  14. Risk example 2 • O7: Module size too large for number of workstations available • Likelihood? H • Who affected? Students, Academic staff, Support staff • How much? University • When? Before assessment This was the highest-severity risk for us.
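
The two examples above share a common shape: risk ID and type, likelihood, who is affected, scope, and timing. A minimal Python sketch of how a risk register along these lines might be encoded; the ordinal weights and the scoring rule are illustrative assumptions, not values from the talk:

    from dataclasses import dataclass

    # Illustrative ordinal scales; the weights are assumptions, not from the talk.
    LIKELIHOOD = {"L": 1, "M": 2, "H": 3}
    SCOPE = {"Module": 1, "Faculty": 2, "University": 3}

    @dataclass
    class Risk:
        risk_id: str      # e.g. "P1" (Pedagogic), "O7" (Operational)
        description: str
        likelihood: str   # L / M / H
        affected: tuple   # who is affected
        scope: str        # how much: Module / Faculty / University
        when: str         # before / during / after assessment

        def score(self) -> int:
            # Simple likelihood x scope product to rank risks for attention.
            return LIKELIHOOD[self.likelihood] * SCOPE[self.scope]

    register = [
        Risk("P1", "Assessment method not integrated into the curriculum",
             "M", ("students", "academic staff"), "Module", "before"),
        Risk("O7", "Module size too large for number of workstations available",
             "H", ("students", "academic staff", "support staff"), "University", "before"),
    ]

    # Concentrate on the severe problems first.
    for risk in sorted(register, key=Risk.score, reverse=True):
        print(risk.risk_id, risk.score(), risk.description)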

  15. Risk management example O17: Different invigilation requirements for CAA not recognized • Liaise with examinations office • Produce guidelines for invigilators • Ensure support team in exam includes technical support staff • Academic staff to be present at start of on-line examination

  16. Issues • load testing the system for large-scale summative assessments (see the sketch below) • link between student records and assessment databases • student computing mistakes • adequate computer facilities on all sites • wayward staff
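
The talk does not describe the load-testing tooling used, but a minimal Python sketch (standard library only) shows the idea behind the first issue above: simulate a full exam cohort requesting the assessment server at once and record response times. The URL and cohort size are hypothetical placeholders.

    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    # Hypothetical endpoint; substitute your assessment server's test page.
    URL = "http://assessment.example.ac.uk/test"
    CONCURRENT_STUDENTS = 100  # size of the largest exam cohort to simulate

    def one_student(_):
        start = time.monotonic()
        try:
            with urlopen(URL, timeout=10) as response:
                response.read()
            return time.monotonic() - start
        except OSError:
            return None  # failed request; counted separately below

    with ThreadPoolExecutor(max_workers=CONCURRENT_STUDENTS) as pool:
        results = list(pool.map(one_student, range(CONCURRENT_STUDENTS)))

    ok = [t for t in results if t is not None]
    print(f"{len(ok)}/{len(results)} requests succeeded")
    if ok:
        print(f"worst response time: {max(ok):.2f}s")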

  17. Some findings (from Roy Lowry) • Introduction of formative tests each week after lecture • Students are motivated enough to use the system • Although requiring some effort to set up, the system can be re-used • CAA in a formative mode has a significant impact upon learning

  18. Next step • Expanded to cover all material • Use of MC, MR, drop-down boxes and numeric questions • Used for the end-of-module test • 103 students in Babbage open access area • Students obtained their marks immediately after they finished the test
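
The deck does not show how these question types were marked, but a rough Python sketch of automatic scoring for the four types named above (MC, MR, drop-down, numeric) illustrates what makes instant marks feasible. All-or-nothing MR marking and the numeric tolerance are assumptions:

    def score_mc(answer, key):
        # Multiple choice / drop-down: one selected option, right or wrong.
        return 1.0 if answer == key else 0.0

    def score_mr(answers, key):
        # Multiple response: all correct options selected and no incorrect ones
        # (all-or-nothing marking assumed; partial credit is also common).
        return 1.0 if set(answers) == set(key) else 0.0

    def score_numeric(answer, key, tolerance=0.01):
        # Numeric: accept values within a tolerance of the model answer.
        return 1.0 if abs(answer - key) <= tolerance else 0.0

    print(score_mc("b", "b"), score_mr({"a", "c"}, ["a", "c"]), score_numeric(3.14, 3.1416))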

  19. Student use • 30% of students attempted all of the tests (average mark: 72%) • 65% attempted some of the tests (average mark: 53%) • 5% did not use the system (average mark: 45%)
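
Assuming the three percentages partition the whole cohort, these figures imply a class-wide average of roughly 58%, as a quick check confirms:

    # (proportion of cohort, average mark) from the slide above
    cohorts = [(0.30, 72), (0.65, 53), (0.05, 45)]
    overall = sum(p * mark for p, mark in cohorts)
    print(f"overall average: {overall:.1f}%")  # ~58.3%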

  20. Benefits Benefits for staff... • easier to give frequent feedback • no marking! Benefits for students… • more frequent feedback • instant marking • more self-assessment

  21. “Cost effectiveness” • Thanks to James Wisdom (ELEN Conference 2000) for the following

  22. Cost benefit analysis • Useful when benefits can be expressed in monetary terms, e.g. saving in staff time (hence money), etc. • Is this why we use CAA?
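
A toy version of such a calculation, with every figure a hypothetical placeholder rather than data from the talk:

    # Hypothetical figures for illustration only.
    students = 103                      # cohort size as in the end-of-module test
    minutes_marking_per_script = 15     # assumed hand-marking time
    staff_cost_per_hour = 40.0          # GBP, assumed
    authoring_hours = 20                # one-off cost to set up the CAA test

    saving = students * minutes_marking_per_script / 60 * staff_cost_per_hour
    cost = authoring_hours * staff_cost_per_hour
    print(f"marking saved per run: {saving:.0f} GBP, setup cost: {cost:.0f} GBP")
    # The test can be re-used, so the setup cost amortises across runs.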

  23. Cost effectiveness • Useful when outcomes cannot be expressed in monetary terms • looks at outcomes in relation to goals: “on time, on budget, to quality” • Is this why we use CAA?

  24. Pedagogic effectiveness • Learners learning, and learning better • Must be part of cost effectiveness • Is this why we use CAA??

  25. Questions • Over to you
