
Learning Technology & the Evaluation of Learning Outcomes


Presentation Transcript


  1. Learning Technology & the Evaluation of Learning Outcomes Andrew Oliver

  2. Evaluation Techniques Educational effectiveness: ability to impart knowledge and understanding Techniques: • Quantitative (pre / post testing) • Qualitative (questionnaire, interview) The method used depends on the context Ideally one method supplements the other

  3. Pre & Post Testing When? • directly before and after (control group) • exam performance (compare with previous year) But • time between exercise & assessment • knowledge gain through other methods (books, tutorials, revision) • varying levels of validity
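A minimal sketch of the quantitative route, not part of the original slides: the scores below are invented placeholders and the use of SciPy's paired t-test is an assumption for illustration only. It pairs each student's pre- and post-test scores, reports the mean gain, and asks whether the improvement is unlikely to be chance.

  # Hypothetical pre / post comparison -- scores are invented, not real evaluation data
  from statistics import mean
  from scipy.stats import ttest_rel   # paired-samples t-test

  pre_scores  = [42, 55, 60, 48, 70, 65]    # % score before the exercise
  post_scores = [58, 63, 72, 55, 81, 74]    # % score after the exercise

  gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
  print(f"Mean gain: {mean(gains):.1f} percentage points")

  # Paired t-test: same students measured before and after
  t_stat, p_value = ttest_rel(post_scores, pre_scores)
  print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

As the slide cautions, any gain measured this way is confounded by the time between exercise and assessment and by knowledge gained from books, tutorials and revision, which is why a control group strengthens the comparison.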

  4. Questionnaire - Usability Common usage: usability of the technology Example • Did you find it difficult to move from Section A to B? • Were the download times acceptable? • Did you enjoy the exercise? Relates to pedagogical design

  5. Questionnaire - Educational Rarely educational: student self-rating (Likert) Example • By how much did you feel the application increased your knowledge? • How much did you feel the Worked Examples helped your understanding of the topic? Key word “feel” - accuracy & honesty
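To make the self-rating idea concrete, here is a small hypothetical sketch, not from the original presentation: it tallies invented Likert responses (an assumed 1-5 scale) to one of the items above and reports the distribution and the mean rating.

  # Hypothetical Likert tally -- responses are invented for illustration
  from collections import Counter
  from statistics import mean

  # 1 = "not at all" ... 5 = "a great deal", for the item
  # "By how much did you feel the application increased your knowledge?"
  responses = [4, 3, 5, 4, 2, 4, 3, 5, 4, 3]

  counts = Counter(responses)
  for rating in range(1, 6):
      print(f"{rating}: {counts.get(rating, 0)} responses")
  print(f"Mean self-rating: {mean(responses):.2f}")

The mean is only a crude summary; as the slide notes, what students “feel” they gained depends on the accuracy and honesty of their self-ratings.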

  6. Questionnaire Reality • mixture of Educational and Usability • impression of attitude towards technology • idea of the learning environment • hence context in which the application works best

  7. Problems, Problems Problems associated with each method Pre & Post Testing • Preparation time (new questions) • Timing • Co-operation (staff & student) • Not anecdotal • Control group (selection & reliability) • More time (interpretation)

  8. Problems, Problems Questionnaire • Data limitations (banding & differentiation) • Self-rating (accuracy and honesty) Are students honest? …it depends… • Anecdotal (poor trends)

  9. Yet more… Logistics • Timing (student workload, availability) – can you wait? • Resources (workstations, rooms, network) • Liaison (network admin, LRC) – cooperation

  10. Will it ever end? Does it do what the label says it does? • measure what it’s supposed to measure? (learning outcomes) • measured accurately? • any outcomes not measured? Time needed to redesign the test items (Bloom’s taxonomy)

  11. Context of Evaluation The intended context of use Stand alone resource: • Evaluate immediately after exercise • Pre & post test • Control group Integrative approach: • Questionnaire

  12. Context of Evaluation Relate the context of the evaluation to the intended educational setting of the technology

  13. So Which Method’s Best? (or what can you realistically expect at the UH?) Ideal • pick ’n’ mix approach – one method augments the other • but time required (design, analysis) Realistic • questionnaire (least time, user-friendly, mix of quant/qual) • determine attitude towards IT and logistical problems – curriculum planning

  14. Ways around it (what you have to do to get what you want) • pre / post test: disguise the exercise pre & post testing and questionnaire • bribery: marks awarded for attendance • questionnaires: bribery or ambush or both Finally • no one method gives the complete answer • use a variety and your experience / intuition

  15. LTDU web resources / references • Questionnaire item database • LTDU website: http://www.herts.ac.uk/ltdu References: • Bloom, B. S. (ed.), 1956. Taxonomy of Educational Objectives, Handbook 1: Cognitive Domain, David McKay Co: New York • Gronlund, N. E., 1971. Measurement and Evaluation in Teaching, Macmillan Co: New York • Laurillard, D., 1993. Rethinking University Teaching, Routledge: London
