
Evaluation Methods (for Mobile Learning Research) and Sample of Research Roadmap


Presentation Transcript


  1. Evaluation Methods (for Mobile Learning Research) and Sample of Research Roadmap. Prof. Dr.-Ing. Kalamullah Ramli, M.Eng.

  2. About Me • National Reviewer for • Hibah Riset Kemitraan PT, Industri dan Pemda • Riset Unggulan Perguruan Tinggi dan Industri (RAPID) • Penelitian Unggulan Strategis Nasional (PUSNAS) • Penelitian Unggulan Internasional • Technical Reviewer • International Journal on Learning • IEEE Malaysia International Conference on Communications (MICC) 2008 • IEEE Malaysia International Conference on Communications (MICC) 2009 • International Conference on Quality in Research (QiR)

  3. Instant Broadband: Anywhere! Anytime!

  4. Dense Wireless Data Network: Increasing Demand in Mobile Data Access

  5. What is mobile learning? • Learning with portable technology • Focus on the technology • Could be in a fixed location, such as a classroom • Learning across contexts • Focus on the learner • Could use portable or fixed technology • How people learn across locations and transitions • Learning in a mobile world • Focus on the mobile society • How to understand people and technology in constant mobility • How to design learning for the mobile society

  6. Can mobile learning be effective? • We think so! • Classroom response systems • Group learning with wireless mobiles and phones • Classroom handheld simulation games • Mobile guides • Connecting learning in formal and informal settings • Lack of convincing studies of mobile learning • Attitude surveys and interviews: “they say they enjoy it” • Observations: “they look like they are learning” • With a few exceptions

  7. Issues in evaluating mobile learning • It may be mobile • Tracking activity across locations • It may be distributed • Multiple participants in different locations • It may be informal • How can we distinguish learning from other activities? • It may be extended • How can we evaluate long-term learning? • It may involve a variety of personal and institutional technologies • Mobile and fixed phones, desktop machines, laptops, public information systems • There may be specific ethical problems • How can and should we monitor everyday activity?

  8. What do you want to know? • Usability • Well-tested methods: • Expert evaluations (e.g. Heuristic evaluation and Cognitive Walkthrough) • Lab-based comparisons • Usefulness • Hard: depends on the educational aims and context • Field-based interviews, observations and walk-throughs • Ethnographic analysis • Critical incident studies (including focus group replay) • Learning outcome measures • Control group • Pre-test, intervention, post-test, delayed post-test • Logbooks and diaries • Logbooks of activity • Diary/diary-interview method, used successfully for intensive study of everyday learning over time
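
  As an illustration of the "learning outcome measures" bullet above, here is a minimal sketch of how gain scores from a pre-test / post-test design with a control group might be compared. The scores, group sizes and the choice of an independent-samples t-test on gains are illustrative assumptions, not taken from any study reported here.

    # Illustrative sketch only: hypothetical scores for a control-group,
    # pre-test / intervention / post-test design.
    from scipy import stats

    # Hypothetical test scores (0-100) for two groups of students
    treatment_pre  = [52, 48, 61, 55, 44, 58, 50, 63]
    treatment_post = [68, 60, 74, 70, 55, 72, 66, 78]
    control_pre    = [51, 47, 60, 56, 45, 57, 49, 62]
    control_post   = [58, 52, 64, 61, 49, 62, 55, 67]

    # Gain score per student: post-test minus pre-test
    treatment_gain = [post - pre for pre, post in zip(treatment_pre, treatment_post)]
    control_gain   = [post - pre for pre, post in zip(control_pre, control_post)]

    # Compare the gains of the two groups with an independent-samples t-test
    t_stat, p_value = stats.ttest_ind(treatment_gain, control_gain)
    print(f"mean gain (treatment) = {sum(treatment_gain) / len(treatment_gain):.1f}")
    print(f"mean gain (control)   = {sum(control_gain) / len(control_gain):.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

  A delayed post-test would be analysed in the same way, using scores collected some weeks after the intervention.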

  9. Some evaluation methods (contd.) • Usefulness (contd.) • Other feedback methods • Telephone probes • Snap polls • Interviews • Focus groups • Automatic logging • Recording where, when and how a mobile device is used • Quantitative analysis of student learning actions (Trinder et al., 2005) • Learning outcome measures • Control group • Pre-test, intervention, post-test, delayed post-test • Attitude • Attitude surveys • General attitude surveys are of little use: almost all innovations are rated between 3.5 and 4.5 on a 5-point Likert scale • Specific questions can indicate issues (e.g. interface problems) • Microsoft Desirability Toolkit • Users indicate their attitudes through choice of cards
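
  To make the "automatic logging" idea concrete, the sketch below aggregates a hypothetical usage log by application and by location; the log format, field names and values are assumptions for demonstration only, not the logging scheme used by Trinder et al.

    # Illustrative sketch only: a hypothetical log of where, when and how
    # a mobile device was used, summarised with simple counts.
    from collections import Counter
    from datetime import datetime

    # Each hypothetical entry: (timestamp, location, application)
    log = [
        (datetime(2005, 3, 1, 9, 15),  "library",   "calendar"),
        (datetime(2005, 3, 1, 12, 40), "cafe",      "email"),
        (datetime(2005, 3, 2, 10, 5),  "classroom", "concept_mapper"),
        (datetime(2005, 3, 2, 18, 30), "home",      "web_browser"),
        (datetime(2005, 3, 3, 9, 50),  "library",   "email"),
    ]

    # Simple quantitative summaries of usage patterns
    by_app      = Counter(app for _, _, app in log)
    by_location = Counter(loc for _, loc, _ in log)

    print("Events per application:", dict(by_app))
    print("Events per location:   ", dict(by_location))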

  10. Case studies • Student Learning Organiser • Long term learning • MyArtSpace • Learning across contexts • PI: Personal Inquiry • Ethics

  11. Interactive Logbook project. Corlett, D., Sharples, M., Chan, T., Bull, S. (2005) Evaluation of a Mobile Learning Organiser for University Students, Journal of Computer Assisted Learning, 21, pp. 162-170. • 17 MSc students, University of Birmingham • Academic year 2002-03 • Loaned iPAQ with wireless LAN for personal use • Learning organiser: Time manager, Course manager, Communications, Concept mapper • Standard tools: Email, Instant messenger, Web browsing • Free to download further software from the web

  12. Evaluation methods • Questionnaires • administered at 1, 4, 16 weeks, and 10 months • Focus groups, following each of the questionnaires • Logbooks • Students kept logbooks for six weeks • Students’ attitudes towards the learning organiser • Patterns of usage of the various applications (including any they had downloaded themselves) • Patterns of usage of the technology, particularly with respect to wireless connectivity • Ease of use issues • Issues relating to institutional support for mobile learning devices • Videoed interactions • To compare the concept map tools, three students were videoed carrying out an exercise, which they commented on after reviewing the video

  13. Data • Usability • Size, memory, battery life, speed, software usability, integration • Usefulness • of PDAs • of Learning Organiser • of concept mapping tools • Patterns of use • Locations • Changes over time

  14. Frequency of use

  15. Use of PDA in specific locations. Rank order for coursework, and in brackets for other activities

  16. Perceived usefulness of tools (“useful” or “very useful”)

  17. Perceived impact on activities. Number of students naming tool as having greatest impact

  18. Results • Some usability problems • Especially battery life • Most use of calendar, timetable and communications • PDA-optimised content was well used • Importance of connectivity • No clear demand for a specific “student learning organiser” • Concept mapping tools were not widely used • Not generally used while travelling • Ownership is important • Need for institutional support

  19. MyArtSpace • Service on mobile phones for enquiry-led museum learning • Aim to make school museum visits more engaging and educational • Students create their own interpretation of a museum visit which they explore back in the classroom • Learning through structured enquiry, exploration • Museum test sites • Urbis (Manchester) • The D-Day Museum (Portsmouth) • The Study Gallery of Modern Art (Poole) • About 3000 children during 2006

  20. How it works • In class before the visit, the teacher sets an inquiry topic • At the museum, children are loaned multimedia phones • Exhibits in the museum have 2-letter codes printed by them • Children can use the phone to • Type the code to ‘collect’ an object and see a presentation about it • Record sounds • Take photos • Make notes • See who else has ‘collected’ the object • All the information collected or created is sent automatically to a personal website showing a list of the items • The website provides a record of the child’s interpretation of the visit • In class after the visit, the children share the collected and recorded items and make them into presentations
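
  To illustrate the collection flow described above, here is a small, purely hypothetical sketch; it is not the actual MyArtSpace software, and all names, codes and data structures are invented. It models a child typing a two-letter exhibit code, the matching object being "collected", and the item being appended to a personal record of the kind that would later appear on the child's website.

    # Illustrative sketch only: not the real MyArtSpace implementation.
    EXHIBITS = {
        "DD": "D-Day landing craft",       # hypothetical exhibit codes
        "SP": "Spitfire cockpit section",
    }

    def collect(code, personal_record):
        """Look up a two-letter exhibit code and add the item to the child's record."""
        code = code.strip().upper()
        if code not in EXHIBITS:
            return f"Unknown code: {code}"
        item = {"code": code, "title": EXHIBITS[code], "notes": [], "photos": []}
        personal_record.append(item)   # in the real service this is sent to a personal website
        return f"Collected: {item['title']}"

    record = []
    print(collect("dd", record))       # -> Collected: D-Day landing craft
    print(collect("xx", record))       # -> Unknown code: XX
    print(len(record), "item(s) in the personal record")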

  21. Lifecycle evaluation • Micro level: Usability issues • technology usability • individual and group activities • Meso level: Educational Issues • learning experience as a whole • classroom-museum-home continuity • critical incidents: learning breakthroughs and breakdowns • Macro level: Organisational Issues • effect on the educational practice for school museum visits • emergence of new practices • take-up and sustainability

  22. Evaluation. At each level: • Step 1 – what was supposed to happen • pre-interviews with stakeholders (teachers, students, museum educators) • documents provided to support the visits • Step 2 – what actually happened • observer logs • post-visit focus groups • analysis of video diaries • Step 3 – differences between 1 & 2 • reflective interviews with stakeholders • critical incident analysis

  23. Summary of results • The technology worked • Photos, information on exhibits, notes, automatic sending to website • Minor usability problems • Students liked the ‘cool’ technology • Students enjoyed the experience more than their previous museum visit • The students indicated that the phones made the visit more interactive • Teachers were pleased that students engaged with the inquiry learning task

  24. Usability Issues • Appropriate form factor • Device is a mobile phone, not a typical handheld museum guide • Collecting and creating items was an easy and natural process • Mobile phone connection • Text annotations • Integration of website with commercial software, e.g. PowerPoint

  25. Educational Issues • Supports curriculum topics in literacy and media studies • Encourages meaningful and enjoyable pre- and post-visit lessons • Encourages children to make active choices in what is normally a passive experience • Teacher preparation • Need for teacher to understand the experience and run an appropriate pre-visit lesson • Where to impose constraints • Structure and restrict the collecting activity, or learn from organising the material back in the classroom • Support for collaborative learning • “X has also collected” wasn’t successful

  26. Summary of methods • Interactive logbook • Usability • Videoed interactions with comparative systems and reflective discussion • Usefulness • Questionnaires, focus groups, user logbooks • Attitude • Questionnaires • MyArtSpace • Usability • Heuristic evaluation • Usefulness • Structured interviews with stakeholders • Videotaped observations and notes, critical incident analysis • Focus group interviews with learners to discuss incidents • Attitude • Interviews with stakeholders • PI: Personal Inquiry • Still to be determined, but will include: stakeholder panels, videotaped observations and critical incident analysis, comparative tests of learning process and outcomes for selected tasks

  27. If time permits .....................

  28. Problems of Indonesian ICT Companies • Limited availability of resources, especially financial capital, during the development/scale-up phase

  29. The Grand Strategy • Innovation lifecycle: from basic idea to marketable products

  30. The Grand Strategy
