
Improving the Quality of Flexible Learning


Presentation Transcript


  1. EdTech 2009 Improving the Quality of Flexible Learning Daire Ó Broin and Siobhán Clarke Distributed Systems Group TCD 21st May 2009

  2. Introduction • What is Flexible Learning? • enables learners to choose where, when, and how they learn • What is Quality? • the “degree to which a set of inherent characteristics fulfils requirements” [ISO 9000:2005] • Improving the quality of flexible learning • different learners have different requirements • the goal is to meet learners’ increasingly demanding requirements

  3. Introduction • Many desirable requirements stem from a content repository adapting to the user’s context: • learners want examples that explain material using subjects of interest to them • want learning tailored to their preferred learning style • want the experience to be enjoyable and motivating

  4. Outline • The flow model • An approach to creating the key conditions of flow • An evaluation of the approach • Conclusion and future work

  5. The Flow model • Flow • an immensely enjoyable mental state characterised by “a complete immersion in what one is doing” [Csikszentmihalyi, 1990] • 3 key conditions of flow: • a challenging task that requires skills, where the person believes their skills match the challenges • clear goals • feedback

  6. Example of flow

  7. Research Question: how can we produce the 3 key conditions of flow? • Our approach: • Large repository of tasks • Model student skills • Recommend suitable tasks from the content repository based on skill model • Supply or enhance feedback • Sample application: Inka • A mobile teaching assistant tool for Java programming

  8. Recommender Systems • Recommender systems estimate the ratings of items a user has not yet seen • Approaches are usually classified as: • collaborative, e.g. Amazon • content-based, e.g. online news sites • hybrid [Kim, 2006]
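A minimal sketch of the two main families (Python; the rating matrix, item features, and function names are invented for illustration): a collaborative estimate borrows ratings from similar users, while a content-based estimate matches item features against a profile built from the user's own history.

```python
import numpy as np

# Invented user-item rating matrix: rows = users, columns = items, 0 = unrated.
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 4, 4],
], dtype=float)

def collaborative_estimate(R, user, item):
    """User-based collaborative filtering: estimate the rating as a
    similarity-weighted average of other users' ratings for the item."""
    raters = np.where((R[:, item] > 0) & (np.arange(len(R)) != user))[0]
    if len(raters) == 0:
        return 0.0
    sims = np.array([np.dot(R[user], R[v]) /
                     (np.linalg.norm(R[user]) * np.linalg.norm(R[v]))
                     for v in raters])
    return float(np.dot(sims, R[raters, item]) / sims.sum())

# Invented item feature vectors (e.g. topic indicators) for the content-based variant.
item_features = np.array([
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
], dtype=float)

def content_based_estimate(R, item_features, user, item):
    """Content-based filtering: score an item by its similarity to a profile
    built from the features of items the user has already rated."""
    rated = R[user] > 0
    profile = R[user, rated] @ item_features[rated]
    profile /= np.linalg.norm(profile)
    target = item_features[item] / np.linalg.norm(item_features[item])
    return float(profile @ target)

print(collaborative_estimate(R, user=1, item=1))                  # estimate on the rating scale
print(content_based_estimate(R, item_features, user=1, item=1))   # similarity in [0, 1]
```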

  9. Task Recommender Systems • Recommender systems have been built to recommend a diverse range of items • Main difference when recommending learning tasks: • items such as movies are recommended on the assumption of shared taste, which changes little over time • recommendations of learning tasks are based on many properties, some of which change rapidly, e.g. the learner’s skills

  10. Multi-criteria Recommendation • Most recommendation problems solved to date have involved single-criterion rating systems [Adomavicius and Kwon, 2007] • Multi-criteria rating is imperative for recommending learning tasks, because context matters • Three criteria: • balance of skills and challenges • clear goal • feedback • Calculate a score for each single-criterion problem and combine them into an overall rating
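As an illustration of the last bullet, a minimal sketch of combining the three per-criterion scores into one overall rating. The three criteria come from the slide; the weighted-sum aggregation and the weight values are assumptions for illustration, not the formula used in the presented system.

```python
CRITERIA = ("skill_challenge_balance", "clear_goal", "feedback")

# Hypothetical relative importance of each criterion (assumed, not from the slides).
WEIGHTS = {"skill_challenge_balance": 0.5, "clear_goal": 0.25, "feedback": 0.25}

def overall_rating(scores):
    """Combine per-criterion scores (each normalised to [0, 1]) into one
    overall rating used to rank candidate tasks."""
    return sum(WEIGHTS[c] * scores[c] for c in CRITERIA)

# Example: good skill/challenge balance, fairly clear goal, weak feedback.
print(overall_rating({"skill_challenge_balance": 0.9,
                      "clear_goal": 0.7,
                      "feedback": 0.4}))  # 0.725
```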

  11. MCR Approach Overview • User gets a list of recommended tasks. • User does the task. • User rates the task along the three criteria. • Ratings used to improve performance.
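A sketch of the recommend, do, rate loop described on this slide. All class and method names are hypothetical and the recommendation logic is stubbed out; this is not the actual Inka implementation.

```python
class TaskRecommender:
    """Skeleton of the loop on the slide, with the ranking step stubbed out."""

    def __init__(self):
        self.ratings = []  # (task_id, {criterion: rating}) pairs

    def recommend(self, learner_id, n=5):
        """Return the n tasks with the highest predicted overall rating.
        A real implementation would consult the skill model and task index."""
        return [f"task-{i}" for i in range(n)]  # placeholder ranking

    def record_rating(self, task_id, criterion_ratings):
        """Store the learner's ratings along the three criteria so that
        future recommendations can be improved."""
        self.ratings.append((task_id, criterion_ratings))

recommender = TaskRecommender()
tasks = recommender.recommend(learner_id="student-42")
chosen = tasks[0]                      # the learner does this task ...
recommender.record_rating(chosen, {    # ... and rates it on the three criteria
    "skill_challenge_balance": 4,
    "clear_goal": 5,
    "feedback": 3,
})
```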

  12. MCR Approach • Clear goals and feedback • Similar to taste • Can use collaborative approach • Challenges and skills • Skills can change quickly • Can’t use collaborative approach

  13. Challenges and skills • In flow experiments, the balance is usually measured by asking a person to rate: • their level of skills in the activity (0 to 9) • the level of challenges in the activity (0 to 9) • Measuring skills and challenges in this way is ambiguous • it is not clear which specific challenges and skills are being measured

  14. Challenges and skills • The challenges/skills ratio is instead measured indirectly • Perception of the balance of challenges and skills can be viewed as a person’s “confidence regarding what [he/she is] able to do in a situation” [Jackson and Eklund, 2004]

  15. Estimating challenges and skills • A domain expert indexes each task with a vector of weights, one per skill • Modelling confidence in a skill: • confidence is rated on a 5-point scale (1 = definitely can’t do the task, 5 = definitely can)
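A minimal sketch of the two structures this slide describes, with invented skill names, weights, and ratings: an expert-assigned per-task skill weight vector, and a per-skill confidence rating on the 5-point scale.

```python
# Expert-assigned task index: for each task, a weight per skill saying how
# strongly the task exercises that skill (skill names and weights are invented).
task_index = {
    "reverse-a-string":   {"loops": 0.5, "strings": 0.4, "arrays": 0.1},
    "bank-account-class": {"classes": 0.6, "encapsulation": 0.4},
}

# One student's confidence per skill on the slide's 5-point scale
# (1 = definitely can't do a task needing the skill, 5 = definitely can).
confidence = {
    "loops": 4, "strings": 3, "arrays": 2,
    "classes": 1, "encapsulation": 1,
}
```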

  16. Estimating challenges and skills • A small section of the Inka skill model: a skill is rated by rating confidence on a set of tasks that require that skill • The task recommendation score is calculated from the task’s skill index and the student’s skill model
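One plausible way to turn a task's skill index and a student's skill model into a recommendation score, shown as a self-contained sketch. The rule (a task-weighted average of confidence, preferring tasks near a mid-scale target so that challenge and skill stay in balance) and the target value are assumptions, not the formula used in Inka.

```python
def recommendation_score(task_skills, confidence, target=3.5):
    """task_skills: {skill: weight}; confidence: {skill: rating on 1-5}.
    Take the task-weighted average of the student's confidence and score the
    task higher the closer that value is to `target` (neither clearly too easy
    nor clearly too hard). Both the rule and the target are assumptions."""
    total = sum(task_skills.values())
    expected = sum(w * confidence.get(skill, 1)
                   for skill, w in task_skills.items()) / total
    return 1.0 - abs(expected - target) / 4.0  # 1.0 at the target, lower further away

# Hypothetical task index entry and student skill model.
task = {"loops": 0.5, "strings": 0.4, "arrays": 0.1}
student = {"loops": 4, "strings": 3, "arrays": 2}
print(round(recommendation_score(task, student), 2))
```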

  17. Evaluation • Consumer recommender systems have several standardised data sets, e.g. Book-Crossing and EachMovie • These data sets can be used to evaluate recommendation algorithms • In Technology Enhanced Learning (TEL) there are no standardised data sets [Drachsler et al, 2009] • ratings are context specific • Our evaluation was small scale, but successful • TCD: 15 computer science students, 2 x 90 min sessions • 95% confidence interval for the mean flow score: [18.1, 19.3]; the minimum flow score for the 3 conditions to be present is 16
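For reference, a sketch of how a 95% confidence interval for the mean flow score could be computed. The per-participant scores below are invented placeholders; only the sample size (15 students) and the reported interval [18.1, 19.3] come from the slide.

```python
import statistics
from math import sqrt

def confidence_interval_95(scores, t_critical=2.145):
    """Two-sided 95% t-interval for the mean; t_critical is the tabulated
    value for 14 degrees of freedom (n = 15 participants)."""
    n = len(scores)
    mean = statistics.mean(scores)
    sem = statistics.stdev(scores) / sqrt(n)
    return mean - t_critical * sem, mean + t_critical * sem

# Invented flow scores for 15 participants (the real data is not in the slides).
scores = [19, 18, 20, 17, 19, 18, 20, 19, 18, 19, 17, 20, 19, 18, 19]
low, high = confidence_interval_95(scores)
print(f"95% CI for the mean flow score: [{low:.1f}, {high:.1f}]")
```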

  18. Improving quality • As ratings are gathered, poorly rated tasks are flagged • Clear goals and feedback: • flagged tasks are given to the content developer to improve • Skills/challenges: • the task index is automatically modified • simulations have shown this to be effective
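The slides state that poorly rated tasks have their index modified automatically but do not give the rule, so the sketch below is purely an assumption: when a learner's skill/challenge rating disagrees with what the current index predicts, nudge the task's skill weights towards the observation and renormalise.

```python
def update_task_index(task_skills, confidence, observed_balance, lr=0.05):
    """task_skills: {skill: weight}; confidence: {skill: rating on 1-5};
    observed_balance: the learner's skill/challenge rating mapped to 1-5.
    Purely illustrative update rule, not the one used in the actual system."""
    total = sum(task_skills.values())
    predicted = sum(w * confidence.get(s, 1) for s, w in task_skills.items()) / total
    error = observed_balance - predicted
    # If the task felt harder than predicted (error < 0), shift weight towards
    # skills this learner is weak in, and vice versa; keep weights positive.
    updated = {s: max(w + lr * error * (confidence.get(s, 1) - 3), 0.01)
               for s, w in task_skills.items()}
    norm = sum(updated.values())
    return {s: round(w / norm, 3) for s, w in updated.items()}

task = {"loops": 0.5, "strings": 0.4, "arrays": 0.1}
student = {"loops": 4, "strings": 3, "arrays": 2}
print(update_task_index(task, student, observed_balance=2.5))
```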

  19. Conclusion and future work • The requirement can be met in the Java domain • Longitudinal study of the Java domain • Develop applications for different domains • The abundance of content • Can the requirements of a particular flexible learner be met? • Searching repositories by context • Skills are one of the most important contexts, but indexing requires huge effort and agreement on how to decompose domains

  20. References • [Drachsler et al, 2009] Identifying the Goal, User model and Conditions of Recommender Systems for Formal and Informal Learning, Journal of Digital Information, Vol 10, No 2 (2009) • [Jackson and Eklund, 2004] The Flow Scales Manual, Fitness Information Technology, 2004. • [Kim, 2006] What is a Recommender System?, Proceedings of Recommenders06.com (pp 1-21), 2006
