
Presentation Transcript


Slide 1

Adaptive accessible design as input for runtime personalization in standard-based eLearning scenarios

Olga C. Santos, Jesús G. Boticario

ocsantos@dia.uned.es – jgb@dia.uned.es

ADDW 2008 – York, September 22-25


Slide 2

Technology is expected to attend to the learning needs of students

in a personalised and inclusive way

following the lifelong learning paradigm

But…

very often technology is

inappropriate or introduced with insufficient support

Further exclusion for people with disabilities

EU4ALL (IST-2006-034778)



Meaning of disability

“Learners experience a disability when there is a mismatch between the learner’s needs (or preferences) and the education or learning experience delivered”

  • ISO JTC1 SC36

    • Individualized Adaptability and Accessibility in eLearning, Education and Training



Our research goal

  • Improve the learning efficiency

    • Task performance (speed)

    • Course outcomes (results)

    • User satisfaction


Improving learning experiences

Design + Runtime

  • Design: universal design, following specifications

    • Accessible contents: W3C WAI WCAG

    • Learning paths for different learning needs: IMS-LD

    • Contents metadata: IEEE-LOM / IMS MD

    • User characterization: IMS-LIP, IMS-AccLIP, ISO PNP

    • Device capabilities: CC/PP

  • Runtime: personalization through AI techniques

    • Knowledge extracted from users’ interactions

    • Infer user features & preferences (user modelling)

    • Help manage the collaboration

    • Audit performance

    • Context-awareness

    • Recommender systems

EU4ALL (IST-2006-034778) = aLFanet (IST-2001-33288) + inclusion
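These design-time descriptions are what the runtime personalization consumes. As a purely illustrative sketch (the structures and field names below are invented simplifications, not the actual IMS/ISO/CC/PP schemas), a user profile with AccLIP-like display/selection preferences can be matched against metadata describing alternative renditions of a learning object:

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Rendition:
    """One delivery form of a learning object (hypothetical metadata subset)."""
    resource_id: str
    media: str                     # e.g. "text", "audio", "video"
    captions: bool = False
    wcag_level: str = "A"          # declared conformance level

@dataclass
class UserPreferences:
    """Simplified accessibility preferences; not the real IMS-AccLIP binding."""
    preferred_media: List[str] = field(default_factory=lambda: ["text"])
    requires_captions: bool = False
    min_wcag_level: str = "A"

WCAG_ORDER = {"A": 1, "AA": 2, "AAA": 3}

def select_rendition(prefs: UserPreferences,
                     renditions: List[Rendition]) -> Optional[Rendition]:
    """Return the first rendition that satisfies the user's declared needs."""
    for media in prefs.preferred_media:
        for r in renditions:
            if (r.media == media
                    and (r.captions or not prefs.requires_captions)
                    and WCAG_ORDER[r.wcag_level] >= WCAG_ORDER[prefs.min_wcag_level]):
                return r
    return None

# Example: a learner who prefers text and requires at least WCAG AA
prefs = UserPreferences(preferred_media=["text", "audio"], min_wcag_level="AA")
renditions = [Rendition("lo-17-video", "video", captions=True, wcag_level="A"),
              Rendition("lo-17-html", "text", wcag_level="AA")]
match = select_rendition(prefs, renditions)
print(match.resource_id if match else "no accessible rendition")   # -> lo-17-html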



Outcomes from evaluations with users

Carried out in the ALPE project (eTEN-029328)

Contents developed using the WCAG to suit end-users’ accessibility preferences

Dynamic support would have improved the learning performance and increased the learner’s satisfaction



The educational experience is holistic

  • Provide accessible learning experiences

    • The learning path that the student chooses to follow should be accessible, even though individual online components or learning objects may not be.

  • Rather than aiming to provide an e-learning resource which is accessible to everyone, resources should be tailored for the student’s particular needs

  • Although the WCAG guidelines can be used to “ensure” that learning objects are accessible, this may not always be desirable from a pedagogic standpoint.



Dynamic support demanded on ALPE

  • Need 1: Adapt the language used and offer glossaries that clarify terms (PREVIOUS KNOWLEDGE)

    • If the difficulty level of a particular content is high and the user has not passed the evaluation of the associated learning objective

      → recommend more detailed content and a glossary with the complex terms from the text

  • Need 2: Highlight what information is most important (INTEREST)

    • If the semantic density of a content is high

      → alert the user of its relevance

  • Need 3: Suggest functionality from the browser (TECH. SUPPORT)

    • If the user has little experience using the Internet and uses a screen reader

      → suggest and explain how to access abbreviations and acronyms

  • Need 4: Provide dynamic guidance and embedded help (TECH. SUP.)

    • If the user's technology level is low and they are new to the platform

      → explain how to navigate the platform, how to use their user agents, and provide technical assistance
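The four needs above read as design-time condition → action rules that a recommender can evaluate at runtime against the user model. A minimal sketch under assumed attribute names (difficulty, semantic_density, technology_level, ...); it is not the actual ALPE/EU4ALL implementation:

# Hypothetical condition/action rules for the four ALPE needs above.
# Attribute names (difficulty, passed, semantic_density, ...) are assumptions,
# not the actual EU4ALL / ALPE data model.

def recommendations(user, content):
    recs = []
    # Need 1: previous knowledge
    if content["difficulty"] == "high" and not user["passed"].get(content["objective"], False):
        recs.append("Read the more detailed version of this content and its glossary of complex terms")
    # Need 2: interest
    if content["semantic_density"] == "high":
        recs.append("This content is especially relevant for the learning objective")
    # Need 3: technical support (browser functionality)
    if user["internet_experience"] == "low" and user["uses_screen_reader"]:
        recs.append("See how your screen reader announces abbreviations and acronyms")
    # Need 4: technical support (platform guidance)
    if user["technology_level"] == "low" and user["new_to_platform"]:
        recs.append("Open the embedded guide on navigating the platform and using your user agent")
    return recs

user = {"passed": {}, "internet_experience": "low", "uses_screen_reader": True,
        "technology_level": "low", "new_to_platform": True}
content = {"objective": "obj-1", "difficulty": "high", "semantic_density": "high"}
print(recommendations(user, content))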



Learning performance factors

  • Factors identified from brainstorming with psycho-pedagogical experts

    • Motivation for performing the tasks

    • Platform usage and technological support required

    • Collaboration with classmates

    • Accessibility considerations when contributing

    • Learning styles adaptations

    • Previous knowledge assimilation



Our research goals

  • Improve the learning efficiency

    • Task performance (speed)

    • Course outcomes (results)

    • User satisfaction

  • by offering the most appropriate recommendation in each situation in the course

    • get familiar with the platform

    • get used to the operative framework of the course

    • carry out the course activities

  • addressing the required factors



    Personalized content and service delivery

    • Dynamic support in terms of recommendations which focus on the learning factors

      • Covers the learning needs of the learners and the current context along the learning process

      • Reduces the workload of the tutors

    • Based on a standard-based user model (IMS-LIP/AccLIP)

      • Demographic information

      • Learning styles

      • Technology level

      • Collaboration level

      • Interest level per learning objective

      • Knowledge level per learning objective

      • Accessibility preferences (display, control, selection)

      • Past interactions



    The A2M recommendation model

    Objectives:

    • Support the course designer in describing recommendations in inclusive eLearning scenarios

    • Manage additional information to be given to the user to explain why the recommendation has been offered

    • Obtain meaningful feedback from the user to improve the recommender

      Aims:

      • to be integrated into the LMS with an accessible, usable and explanatory GUI

      • designed with generality in mind so it can be adapted to other domains


    A model for Recommendations in LLL

    [Diagram: a concept map of the recommendation model. A RECOMMENDATION fulfills CONDITIONS and fits in the user's PREFS/CONTEXT before it is offered and applied; it is limited by RESTRICTIONS and a TIMEOUT, belongs to a CATEGORY, is generated by a TECHNIQUE, and has an ORIGIN and an EXPLANATION.]
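For illustration, the entities in this diagram can be sketched as a single data structure. The field names follow the diagram, but the types and the applicability check are assumptions, not the actual A2M implementation:

from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Sketch of the A2M entities named in the diagram above; the runtime check is
# an assumption for illustration, not the EU4ALL implementation.

@dataclass
class Recommendation:
    text: str
    category: str                 # e.g. "Accessibility", "Previous knowledge"
    technique: str                # how it was generated (rule, collaborative filtering, ...)
    origin: str                   # e.g. "defined by the professor"
    explanation: str              # shown to the user on the explanation page
    conditions: List[Callable[[dict], bool]] = field(default_factory=list)
    restrictions: List[Callable[[dict], bool]] = field(default_factory=list)
    timeout: Optional[float] = None   # seconds the offer stays valid

    def applicable(self, context: dict) -> bool:
        """Offer the recommendation only if every condition is fulfilled
        and no restriction applies to the current preferences/context."""
        return (all(c(context) for c in self.conditions)
                and not any(r(context) for r in self.restrictions))

rec = Recommendation(
    text="Read the glossary before quiz 2",
    category="Previous knowledge",
    technique="design-time rule",
    origin="defined by the course design",
    explanation="Your knowledge level for this objective is still low",
    conditions=[lambda ctx: ctx.get("knowledge_level", 0) < 0.5],
)
print(rec.applicable({"knowledge_level": 0.2}))   # -> True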



    Factors → Categories

    • Motivation

    • Learning styles

    • Technical support

    • Previous knowledge

    • Collaboration

    • Interest

    • Accessibility

    • Scrutability


    Process

    [Diagram: the recommendation process. At design time, a Human Expert defines the recommendation types (static). At runtime, Artificial Intelligence techniques combine those types with the dynamic context (user, device, course) to produce recommendation instances, i.e. the recommendations delivered to the USER (learner/tutor) in the LMS.]
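A minimal sketch of this design-time/runtime split: a recommendation type is a template defined by the human expert, and a runtime instantiation binds it to the current user/device/course context. Class and field names are assumptions for illustration only:

from dataclasses import dataclass

# Design time vs. runtime, as in the process diagram above. The classes and
# fields are illustrative assumptions, not the EU4ALL data model.

@dataclass(frozen=True)
class RecommendationType:
    """Defined by the human expert at design time (static)."""
    rec_id: str
    template: str                 # e.g. "Read {item} before attempting {activity}"
    category: str

@dataclass
class RecommendationInstance:
    """Produced at runtime from a type plus the dynamic context."""
    rec_type: RecommendationType
    user_id: str
    device: str
    course_id: str
    text: str

def instantiate(rec_type: RecommendationType, context: dict) -> RecommendationInstance:
    """Bind a design-time recommendation type to the current runtime context."""
    return RecommendationInstance(
        rec_type=rec_type,
        user_id=context["user"],
        device=context["device"],
        course_id=context["course"],
        text=rec_type.template.format(**context.get("slots", {})),
    )

rt = RecommendationType("rec-04", "Read {item} before attempting {activity}", "Previous knowledge")
ctx = {"user": "u-17", "device": "desktop-screenreader", "course": "alpe-01",
       "slots": {"item": "the glossary", "activity": "quiz 2"}}
print(instantiate(rt, ctx).text)   # -> Read the glossary before attempting quiz 2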



    Recommender User interface (page 1)

    If applicable, the recommendation is offered to the user in a usable and accessible user interface, together with a detailed explanation.



    Recommender User interface (page 2)

    Explanation page with additional information regarding the origin, category, technique and a high-level description

    Feedback is requested from this page



    Small-scale experience

    • Objective

      • Get feedback on the recommendation model

        • not to validate the generation of recommendations

    • Settings

      • Access to a course space in the dotLRN LMS

      • 13 static recommendations available

    • Method

      • 30-question test

        • Experience with eLearning platforms

        • Recommender output

        • Type of recommendations


    Slide 19

    • 29 users from two summer courses

    • 16 valid responses:

      • 50% accessibility experts

      • 20% people with disabilities

      • 80% with experience in web-based applications for learning and teaching



    Experience with the platform

    • Perception

      • Very good: 18.75%

      • Good: 75%

      • Average: 6.25%

      • Bad or very bad: 0%

    • Compared to previous experiences

      • Better: 70%

      • Worse: 15%

      • Not Answered: 15%

      • Reasons:

        • Positive opinions:

          • WebCT was not friendly

          • this one adjusts to my learning style

          • this one presents an easier navigation

          • this one is more accessible

          • sections are clearly separated in this one

        • Negative opinion:

          • depends on the time spent to get used to the platform



    Recommender system output (I)

    • All users were aware of the recommender system (RS)

    • None wanted to get rid of it

    • Positive feedback:

      • Very useful service: 56.25%

      • Another service of the platform: 43.75% (it is a demand from the users!)

    • Usage of icons

      • A third of students (31.25%) had not paid attention to them

      • For the other two thirds:

        • Useful and clear: 56.25%

        • Good idea but requiring a redesign: 12.5%

    • Origin of recommendations

      • Most liked to receive this info: 93.75%

      • Preferred origins:

        • recommended by the professor: 93.75%

        • adapted to my preferences: 68.75%

        • defined by the course design: 43.75%

        • useful for my classmates: 43.75%



    Recommender system output (II)

    • Additional information page

      • Not accessed: 37.5%

      • Useful: 62.50%

      • Preferred information:

        • Detailed explanation: 66%

        • Category: 43.75%

        • Origin: 31.25%

        • Technique: 31.25%

    • Categories

      • No other category was identified.

      • Relevance:

        • Learning styles: 68.75%

        • Previous knowledge: 62.50%

        • Interest level: 56.25%

        • Motivation: 43.75%

        • Technical support: 31.25%

        • Scrutability: 31.25%

        • Accessibility: 31.25%

        • Collaboration: 25%



    Feedback on the type of recommendations

    From the learner’s point of view

    • Types of recommendations selected by more than 60% of the users:

      • Fill in a learning style questionnaire, so the system can be adapted to me

      • Read some section of the help, if there is a service in the platform that I don't know

      • Read a message in the forum that has information that may be relevant to me

      • Read a file uploaded by the professor or a classmate

      • Get alerts on deadlines to hand in an activity

    • Types selected by less than 25% of users:

      • Fill in a self-assessment questionnaire

      • Rate some contribution done by a learner

      • Access an external link of the platform

      • Messages without any action (e.g. motivational messages)

    • New suggested type of recommendation:

      • Recommend some aspects of the course that the user had not visited for a long time



    Feedback on the type of recommendations

    From the professor’s point of view

    • Preferred information to define the recommendations:

      • learning styles: 62.50%

      • interest level in course objective: 62.50%

      • collaboration level: 56.25%

      • course features: 56.25%

      • actions already done by the user: 56.25%

      • knowledge level in a course objective: 56.25%

      • accessibility preferences: 43.75%

      • interaction level: 43.75%

      • course space in which the user is navigating: 31.25%

      • technological level: 25%

      • features of the device used to access the course: 18.75%



    Some consequences (I)



    Some consequences (II)



    Evaluation plan

    • User interface

      • WCAG conformance

      • Tests with users (accessibility & usability)

    • Recommendations

      • User satisfaction → questionnaires

      • Task performance → interactions

      • Course outcomes → assessment on objectives

    • Methodology:

      • Study group vs. Control group



    Open issues

    • Categories defined

      • Overlapping???

    • Recommendations on accessibility

      • Suggest alternative learning experiences (not just contents/formats, …)

      • Suggest modifying contributions that are not properly tagged

      • Show user agent functionality

      • Others???

    • Large-scale formal evaluations


    Slide 29

    Adaptive accessible design as input for runtime personalization in standard-based eLearning scenarios

    Thanks

    Olga C. Santos, Jesús G. Boticario

    ocsantos@dia.uned.es – jgb@dia.uned.es

    ADDW 2008 – York, September 22-25

