
m-learning – Evaluating the Effectiveness and the Cost


Presentation Transcript


  1. m-learning – Evaluating the Effectiveness and the Cost John Traxler National ICT Research Centre

  2. Other presentations Developer’s sessions: • Geoff Stead + Jo Colley Monday 12:00 Combined session on content and technology Research Papers: • John Traxler Monday 14:30 Evaluating effectiveness and cost • Alice Mitchell + Kris Popat Monday 16:00 The potential of games • Jill Attewell + Carol Savill-Smith Tuesday 14:45 Focus on learners and learning

  3. Understanding Cost is Important • Profitability, Return on Investment • for universities, colleges, trainers, schools • The transition from print-based learning • perceived economies of scale, possible large up-front costs, increased risks • globalisation; competition; industrialisation • increased training and staff development • learning delivered by tools and teams • quality, accountability, visibility • changes in working practices

  4. Outline of Talk • Predicting the Costs of Software • Predicting the Costs of Educational Multimedia • Predicting the Costs of m-learning • Matching the Costs of m-learning to the Benefits • The Real Difficulties

  5. Software Cost-Estimation • Many different approaches, most dependent on local technology, history, environment; modest successes • Attempts to calculate effort sometime before implementation and delivery • Often based on some measure of program size e.g. KLOC or Function Points

  6. Basic COCOMO • Basic version: effort depends on program size • Effort E = a_b × (KLOC)^b_b person-months • Duration D = c_b × E^d_b months • Number of people N = E / D

  7. Basic COCOMO coefficients by project type:
     Project type     a_b   b_b    c_b   d_b
     Organic          2.4   1.05   2.5   0.38
     Semi-detached    3.0   1.12   2.5   0.35
     Embedded         3.6   1.20   2.5   0.32
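The two formulas and the coefficient table translate directly into a short calculation. A minimal sketch in Python, assuming the Basic COCOMO coefficients above (the 32 KLOC organic example project is illustrative only):

```python
# Basic COCOMO: effort and duration from program size alone.

COEFFS = {
    # project type: (a_b, b_b, c_b, d_b), from the table above
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, project_type: str = "organic"):
    """Return (effort in person-months, duration in months, average staffing)."""
    a, b, c, d = COEFFS[project_type]
    effort = a * kloc ** b      # E = a_b * (KLOC)^b_b
    duration = c * effort ** d  # D = c_b * E^d_b
    people = effort / duration  # N = E / D
    return effort, duration, people

# A 32 KLOC organic project comes out at roughly 91 person-months,
# 14 months elapsed, and an average team of 6-7 people.
print(basic_cocomo(32, "organic"))
```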

  8. Intermediate COCOMO • Intermediate version: effort depends on program size and “cost-drivers”, each rated on a 6-point scale from “very low” to “extra high” • product attributes (reliability, complexity etc.) • hardware attributes (performance, memory etc.) • personnel attributes (experience, capability etc.) • project attributes (tools, methods etc.)
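To make the cost-driver mechanism concrete, here is a small sketch of the intermediate adjustment: each driver's rating maps to a multiplier, and the product of all multipliers (the effort adjustment factor, EAF) scales the nominal size-based estimate. The multiplier values below are illustrative stand-ins, not Boehm's published tables.

```python
# Intermediate COCOMO adjustment: nominal effort x product of cost-driver
# multipliers. A "nominal" rating maps to 1.00; values below/above 1.00
# shrink/grow the estimate. These numbers are illustrative only.

DRIVERS = {
    "required_reliability": 1.15,  # product attribute, rated high
    "product_complexity":   1.30,  # product attribute, rated very high
    "memory_constraint":    1.06,  # hardware attribute, rated high
    "analyst_capability":   0.86,  # personnel attribute, rated high
    "use_of_tools":         0.91,  # project attribute, rated high
}

def adjusted_effort(nominal_effort: float, drivers=DRIVERS) -> float:
    """Scale a nominal (size-only) effort estimate by the EAF."""
    eaf = 1.0
    for multiplier in drivers.values():
        eaf *= multiplier
    return nominal_effort * eaf

# e.g. 91 nominal person-months under these ratings -> about 113
print(adjusted_effort(91.0))
```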

  9. Educational Software Costs - “rules of thumb” • Nothing useful will happen in the first three to six months after you decide to go with [courseware]. It doesn’t make any difference whether you go with a vendor or start producing your own in-house (Lee & Zemke 1987) • The first course produced by a new [courseware] development group will be a collection of mistakes. Throw it away. (Lee & Zemke 1987) • Where team members are not used to working together or are geographically apart, add 10-15% [to the total effort] (Casey et al. 1988) • Analysis and design comprise 50% of total effort (Casey et al. 1988) • Even a skilled Instructional Design author will revise plus or minus half the material after the first or second draft (and then 20-25% in the third draft) (Casey et al. 1988) - cited by Marshall
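These rules of thumb chain into a quick sanity check on a plan. The sketch below applies two of the Casey et al. figures to a baseline courseware estimate; the 500-hour baseline is an invented number.

```python
# Back-of-envelope application of two Casey et al. (1988) rules of thumb.

baseline_hours = 500.0  # invented baseline development estimate

# team members new to each other or geographically apart: add 10-15%
effort = baseline_hours * 1.15

# analysis and design comprise 50% of total effort
analysis_and_design = 0.5 * effort

print(f"total effort: {effort:.0f} hours, "
      f"of which {analysis_and_design:.0f} on analysis and design")
```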

  10. Some Industry Figures

  11. Multimedia Educational Software Cost-Estimation - Marshall • COCOMO approach used on 14 projects in the 1990s • size measured mainly in hours of learner time (cf. KLOC) • 4 potential cost-drivers (24 sub-heads) • course difficulty • development environment • subject expertise • interactivity • Ian Marshall, Abertay University

  12. Multimedia Educational Software Cost-Estimation - Marshall • Significant cost-drivers so far • development environment • instructional design method (+ 4 more) • course difficulty • number of objectives • level of objectives • existing course material
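The talk lists Marshall's drivers but not his calibrated coefficients, so only the model's shape can be sketched: a COCOMO-style size formula keyed to hours of learner time, scaled multiplicatively by the multimedia cost-drivers. Every constant in the sketch below is a hypothetical placeholder.

```python
# Hypothetical Marshall-style estimate: COCOMO's form, with hours of
# learner time in place of KLOC and multimedia cost-drivers as the EAF.
# a, b and the default driver values are invented placeholders.

def multimedia_effort(learner_hours: float,
                      development_environment: float = 1.0,
                      course_difficulty: float = 1.0,
                      subject_expertise: float = 1.0,
                      interactivity: float = 1.0) -> float:
    """Development effort (person-hours) for courseware of a given length."""
    a, b = 100.0, 1.0  # placeholder calibration constants
    eaf = (development_environment * course_difficulty *
           subject_expertise * interactivity)
    return a * learner_hours ** b * eaf

# e.g. 1 hour of learner time, difficult course, good tooling:
print(multimedia_effort(1.0, development_environment=0.9,
                        course_difficulty=1.4))
```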

  13. Gagne’s Media-Mix

  14. Laurillard’s Model or Framework [diagram: the teacher’s world and the learner’s world exchange descriptions (theory) and actions (practice), linked by delivery, discussion, adaptation, reflection and interaction]

  15. print, lecture, Web pages [diagram: these media cover only delivery of theory from the teacher’s world; the practice side is not supported]

  16. seminar, conferencing [diagram: these add discussion at the theory level; the practice side is not supported]

  17. laboratories, fieldwork [diagram: interaction at the practice level, with reflection (perhaps) back to theory]

  18. teaching package [diagram: delivery of theory, with discussion(?) and adaptation(?) only questionably supported]

  19. VLE courses [diagram mapping VLE tools onto the framework: theory - course, noticeboard; discussion - chat, forum, e-mail; applying theory and discussing tasks - group folder, assignments, chat; tasks - agree goals, do tasks, workshops, set tasks; practice - feedback, experience]

  20. m-learning courses [diagram mapping m-learning tools onto the framework: theory - iPAQ, mPortal; discussion - SMS, mPortal; applying theory and discussing tasks - iPAQ exercises, develop exercises; practice - SMS, mPortal]

  21. Practical Strategies • Course Resource Appraisal Model • implemented as an Excel spreadsheet • Open University course resource planning and management tool • based on Laurillard’s work • looks at student workload and author workload across all media options • Media Advisor • also based on Laurillard’s work but simplified • developed at UNL by Martin Oliver & Gráinne Conole • aimed at individual lecturers, to be used iteratively • in the public domain, on CD
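As a rough illustration of what these appraisal tools compute, the toy sketch below tallies student workload against author workload across a candidate media mix; all the hour figures are invented.

```python
# Toy course-resource appraisal: compare student vs. author workload
# across media options. The hour figures are invented for illustration.

media_mix = {
    # medium: (student hours, author hours)
    "print":        (20, 120),
    "conferencing": (10,  15),
    "m-learning":   ( 8,  60),
}

student_total = sum(s for s, _ in media_mix.values())
author_total = sum(a for _, a in media_mix.values())
print(f"student workload: {student_total} h, author workload: {author_total} h")
```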

  22. Media Advisor

  23. Media Advisor

  24. m-learning in an Imperfect World: Some of the Real and Hidden Costs

  25. Constraints to m-learning • Hands-on - computing, music-making, workshop • In vivo - medicine/dentistry/nursing/veterinary, field trips (perhaps) • Interpersonal - interview skills, presentations • Social - team-work (perhaps), business, marketing • Expressive - ballet, dance • Using Tools and Machines - engineering • Laboratory Use - science (perhaps) • (and exams, assessments, vivas etc.)

  26. Costs to m-learning Students • 75% of (undergraduate) students own a PC, 29% have internet access, nearly 100% have mobiles, almost none have PDAs • time is seen as the main cost by (undergraduate) students (but is this location-dependent?)

  27. Costs to Students • Online learning(!) material was often printed • Text of practicals 33% • Discussion of practicals 31% • Web pages 45% • Conference messages 54% • PLUM Report No. 122 • Would m-learning students want to do the same, or can they read from PDAs?

  28. Students’ Preferences • “Which of these learning tools do you prefer to use?” • books 67% • lectures 36% • videos 36% • computers 19% • audio-tapes 11% • internet 7% • none/no preference 3% • none/don’t want to learn 3% • Campaign for Learning, 1996 - predates PDAs! • Why these preferences? What will be the preferences of m-learning students?

  29. Student Use of PC-based CMC • Limited active participation • participate : lurk ratios of 40:60, 30:70, even 10:90 • varied strategies to improve these • based on postings to the “teaching online” mailbase, 1999 • see Gilly Salmon’s work at the OU • Can m-learning turn lurkers into participants?

  30. Costs to m-learning Students • Forms of disadvantage • dyslexia; visual impairment • not confident with IT or English • less affluent • “wrong” cognitive or learner style • New forms of disadvantage • bandwidth poverty • interface poverty (relative to PCs)

  31. Thanks for your time! • Email: John.Traxler@wlv.ac.uk • Samples are at www.ctad.co.uk/m-learning • The m-learning project is at www.m-learning.org
