
Algorithmic e-Assessment with DEWIS


Presentation Transcript


  1. Algorithmic e-Assessment with DEWIS
  Dr Rhys Gwynllyw, Dr Karen Henderson
  Senior Lecturers in Mathematics and UWE Learning and Teaching Fellows
  Department of Engineering Design & Mathematics, UWE, Bristol.

  2. DEWIS e-Assessment system
  • Designed and developed at UWE, first implemented in 2007, and supported by the university. A completely stand-alone, web-based system for both summative and formative assessments.
  • Motivated by problems encountered with other e-assessment systems: licence and version problems, lack of support, fragility, inflexibility, stress, and a 'have another go' culture.
  • Primarily designed for numerate e-assessments; currently used in Business, Computer Science, Nursing, Engineering, Mathematics and Statistics.
  • Used at:
    • UWE.
    • Satellite colleges in the SW of England, Sri Lanka, Malaysia and Nepal.
    • Leeds University (Mathematics).
    • Mathcentre.
  • Open source.
  • For more information, see www.cems.uwe.ac.uk/dewis

  3. Design Criteria
  • Completely stand-alone: independent of commercial software, owing to previous difficulties with licences, support and version updates. However, students log in using their institution access details.
  • Fully algorithmic in all of the following:
    • Question parameter generation, including reverse engineering and (constrained) random variables (see the sketch after this list).
    • Marking (allows for 'intelligent marking').
    • Feedback.
    • Assessment performance analysis.
  • Interacts with other computer systems/languages (R, Python).
  • Loss-less data collection, with easy and direct access to this data through a comprehensive management system.
  • Academics are able to design, develop and own their own questions.
  • Question types: numerical, algebraic expressions, multiple choice, multiple selection, text, matrix, graphical, and compound questions.
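
  As an illustration of reverse-engineered parameter generation, here is a minimal Python sketch that chooses the answer first and derives the displayed question from it. The function name, question format and root ranges are invented for illustration; this is not the DEWIS question format.

    import random

    def generate_parameters(seed=None):
        """Reverse-engineered question generation: choose the answer first,
        then derive the displayed question so the numbers work out cleanly.
        Illustrative sketch only; not the actual DEWIS question format."""
        rng = random.Random(seed)

        # Constrained random variables: two distinct, non-zero integer roots.
        r1 = rng.choice([n for n in range(-9, 10) if n != 0])
        r2 = rng.choice([n for n in range(-9, 10) if n not in (0, r1)])

        # Derive the displayed coefficients from the chosen answer.
        b, c = -(r1 + r2), r1 * r2
        question = f"Find the roots of x^2 + ({b})x + ({c}) = 0."
        return {"question": question, "answer": sorted([r1, r2])}

    print(generate_parameters(seed=42))

  Seeding the generator makes each student's paper reproducible, which is what allows the same question instance to be re-marked later.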

  4. Intelligent Marking
  • Made possible by the algorithmic approach to marking and feedback together with loss-less data collection. Fairer to the student, and much more representative of how a human would mark. (Essential for larger compound questions with multiple coupled inputs.)
  • Common Student Errors (CSE) / Partial Credit: the marking recognises common student errors. In some cases, credit may be given if the answer is 'partially' correct, e.g. the student supplies an angle in radians as opposed to the required degrees (see the sketch after this slide). Whether or not the CSE is credited with marks, the triggering of this error is fed into the feedback supplied to the student.
  • Continuation/Follow-on Marking: where subsequent answers depend on previous ones.
  • Verification Marking: for questions with non-unique solutions. Student answers are verified against a necessary and sufficient condition for correctness, e.g. obtain a vector orthogonal to another vector, or select a line on a graph (also sketched below). Also applies to algebraic inputs and many other scenarios.
  • Retrospective Marking: particularly useful in new and/or large assessments. Facilitates the analysis of the students' performance well beyond the study of 'marks scored' and meta-data. Analysis can highlight new common student errors, which can be fed into existing assessments to alter feedback. Uses DEWIS' loss-less data feature and the extensive Reporter.
  • Staged Assessments: the assessment is in stages, e.g. 'progression to' or 'difficulty of' the next stage depends on performance in previous stage(s). (The marking communicates with the assessment generator.)
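
  The following Python sketch illustrates two of these ideas: partial credit for the radians-instead-of-degrees CSE, and verification marking of a non-unique answer via the dot product. The function names, mark values and tolerances are hypothetical, not the DEWIS marking API.

    import math

    def mark_angle(student_value, true_degrees, tol=1e-3):
        """CSE / partial-credit marker (illustrative): full marks for the
        angle in degrees, partial credit plus targeted feedback if the
        student answered in radians."""
        if math.isclose(student_value, true_degrees, abs_tol=tol):
            return {"marks": 1.0, "feedback": "Correct."}
        if math.isclose(student_value, math.radians(true_degrees), abs_tol=tol):
            # Common Student Error: radians supplied instead of degrees.
            return {"marks": 0.5,
                    "feedback": "You gave the angle in radians; degrees were required."}
        return {"marks": 0.0, "feedback": "Incorrect."}

    def mark_orthogonal(student_vector, given_vector, tol=1e-6):
        """Verification marking for a non-unique answer: any non-zero vector
        orthogonal to the given one is accepted, checked via the dot product
        (a necessary and sufficient condition for correctness)."""
        dot = sum(s * g for s, g in zip(student_vector, given_vector))
        correct = any(abs(s) > tol for s in student_vector) and abs(dot) < tol
        return {"marks": 1.0 if correct else 0.0,
                "feedback": "Verified orthogonal." if correct
                            else "Your vector is not orthogonal to the given vector."}

    print(mark_angle(0.7854, 45))                  # radians CSE -> partial credit
    print(mark_orthogonal([2, -1, 0], [1, 2, 5]))  # one of many valid answers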

  5. Large Compound Questions
  • Distinct from e-assessments with bite-sized questions.
  • A single question involves multiple, coupled parts and typically requires intelligent marking.
  • Option of the single assessment being partitioned into stages (a sketch of one such progression rule follows this slide).
  • Applied to assessments in Business Studies, Engineering, OR and Statistics.
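
  A staged assessment of this kind could gate progression on performance, as in the hypothetical Python sketch below; the pass mark and difficulty rules are invented for illustration and are not DEWIS's actual staging logic.

    def next_stage(stage_scores, pass_mark=0.6):
        """Hypothetical staged-assessment rule: progression to, and difficulty
        of, the next stage depend on performance in the previous stage(s)."""
        latest = stage_scores[-1]
        if latest >= pass_mark:
            # Strong performance unlocks a harder variant of the next stage.
            return {"progress": True,
                    "difficulty": "harder" if latest >= 0.85 else "standard"}
        return {"progress": False, "difficulty": "easier"}

    print(next_stage([0.90]))  # -> progress to a harder next stage
    print(next_stage([0.40]))  # -> repeat at an easier level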

  6. Effect of using intelligent marking
  • 145 (out of 207) students triggered at least one follow-on/partial marking process.
  • Disabling the intelligent marking simulates the marking process of a non-algorithmic e-assessment.
  [Figure: distribution of the percentage differences in students' marks, comparing intelligent versus non-intelligent marking.]
  • The few students who benefited by more than 30% typically made an error early on by using incorrect units.

  7. Lossless Data
  • Extensive Reporter:
    • The academic has access to all data regarding every assessment.
    • The student has access to the data from all of their previous attempts.
  • Retrospective Marking (sketched after this slide):
    • Every assessment can be marked retrospectively.
    • Example reasons:
      • Post-assessment evaluation.
      • Re-evaluation of the marking/feedback algorithm.
      • Oops! A bug in the question.
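
  A retrospective re-mark is possible precisely because the raw responses are stored loss-lessly. The Python sketch below, whose record fields and names are hypothetical rather than the DEWIS data model, re-applies a corrected marking function to stored attempts.

    def remark_assessment(attempts, marker):
        """Re-apply a (possibly corrected) marking function to stored attempts.
        Illustrative sketch only; record fields and names are hypothetical."""
        report = []
        for attempt in attempts:
            new_marks = marker(attempt["responses"])  # raw inputs kept loss-lessly
            report.append({"student": attempt["student"],
                           "old_marks": attempt["marks"],
                           "new_marks": new_marks})
        return report

    # Example: the original marker had a bug (it rejected the correct answer
    # 10); after the fix, every stored attempt is re-marked from raw input.
    attempts = [
        {"student": "s1", "responses": [10], "marks": 0},
        {"student": "s2", "responses": [7],  "marks": 0},
    ]
    fixed_marker = lambda responses: sum(1 for r in responses if r == 10)
    print(remark_assessment(attempts, fixed_marker))

  The same mechanism supports the comparison on the previous slide: re-marking stored attempts with the intelligent-marking features disabled yields the non-intelligent mark for each student.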

  8. Finally
  • Work in progress:
    • Engineering diagnostic tests.
    • More statistics questions using 'R'.
    • Documentation refresh.
  If you are interested in using DEWIS then please contact:
  • rhys.gwynllyw@uwe.ac.uk (system development and deployment)
  • karen.henderson@uwe.ac.uk (question bank development and assessment deployment)
  or visit the welcome page: www.cems.uwe.ac.uk/dewis
