

  1. Towards a framework for assessment in pervasive environments
     Lilia Cheniti Belcadhi, PRINCE Research Unit, ISITC, University of Sousse, Tunisia
     Serge Garlatti, Computer Science Department, Telecom Bretagne, France

  2. Outline
     • General context
     • Use case scenario for pervasive assessment
     • Learning process
     • Assessment process
     • Challenges
     • Semantic models
     • Conclusion and future work

  3. General context
     • Assessment is an important part of the learning process, as it allows learners to keep track of their progress in acquiring knowledge.
     • Self-assessment
     • Interactive assessment
       • Peer assessment is a subcategory of interactive assessment.
     • Objective: deliver assessment activities adapted to the learner's specific situation and context → pervasive assessment using Web 2.0 and the Semantic Web.

  4. Use case scenario for pervasive assessment
     • Student engineers attend a course on information systems and design methodologies.
     • As part of their course projects, students work in groups to propose an information system that manages specific data in a given company department.
     • The activity is deployed in three main phases:
       • Phase 1: drafting a specification document;
       • Phase 2: drafting a modeling document;
       • Phase 3: prototyping the system.

  5. Phase 1: Drafting the specification document
     • Objective: preparation of a specification document for a company information system, in small groups of learners.
     • Two processes, which can occur either sequentially or in parallel:
       • a learning process;
       • an assessment process.
     • Work organization: group work.
       • Coordinator: supervises the workflow and information delivery between the members.
       • Members: access blogs to get a clear idea of the work progress, the planning follow-up, and the list of tasks to be carried out next.

  6. Learning Process

  7. Learning process: Task 1
     • Collect information from the Web on the structure of a specification document:
       • download some reference documents, or
       • share bookmarks/links.
     • Search for book references at the university library.
     • Communicate between peers using e.g. a microblogging tool.

  8. Learning process: Task 2
     • Collect information at the company, then share it:
       • interview prospective users;
       • record the interviews and share them through video-sharing tools.
     • Collect information on the existing systems, then share it.
     • Suggest ways to improve the data and work processes, and share suggestions with group members on site.
     • Validate the proposed improvements through a visit to the company and discussions with employees.
     • Access the discussion results and redefine the specifications accordingly.

  9. Learning process: Task 3
     • Collaboratively write the specification document using e.g. Google Docs.
     • Share the specification document with other groups of learners.
     • Access and download other groups' specification documents.

  10. Assessment Process
     Pervasive assessment comprises:
     • pervasive self-assessment;
     • pervasive peer assessment.

  11. Pervasive Self-Assessment
     • The knowledge-checking process depends on the learner's current context. Examples:
       • assessment of knowledge on database architecture for learners at the company;
       • assessment of knowledge on the specification of information systems for learners at the library.
     • Process (see the sketch below):
       • learners receive a set of randomly selected questions to answer;
       • they get recommendations of course parts to review before moving on to practice;
       • they receive a list of names, including friends or tutors connected on social platforms at the same time, who can help them with their practice work.
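
A minimal Python sketch of how the context-dependent random selection of questions might work. The question bank, its context tags, and the function name are illustrative assumptions, not part of the presented framework.

import random

# Hypothetical question bank: each question is tagged with the learning
# context in which it is relevant (tags are illustrative).
QUESTION_BANK = [
    {"id": "q1", "context": "company",
     "text": "Which layer of a three-tier architecture hosts the DBMS?"},
    {"id": "q2", "context": "library",
     "text": "Which sections should a specification document contain?"},
    # ... more questions ...
]

def select_questions(learner_context, bank=QUESTION_BANK, n=5):
    """Return up to n randomly selected questions matching the learner's context."""
    candidates = [q for q in bank if q["context"] == learner_context]
    return random.sample(candidates, min(n, len(candidates)))

# A learner detected at the company gets database-oriented questions.
print(select_questions("company"))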

  12. Peer Assessment
     • Learners assess the work done by other learners, working collaboratively on a specification document drafted by a group of their choice.
     • Assessment criteria are prepared in advance and shared by the tutor.
     • Learners may post comments and ask for clarification of some assessment criteria.

  13. Peer Assessment (Cont.)
     • Every learner delivers an assessment scheme that includes his/her appreciation and shares it with his/her group members.
     • The group coordinator (a summarization sketch follows below):
       • summarizes the grades given for each assessment criterion;
       • shares the group assessment report of the selected specification document, containing the group's overall feedback.
     • Benefits:
       • gives the group the opportunity to review its own specification document and propose a new version;
       • enriches the group's knowledge.
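
A sketch, under assumed data structures, of the coordinator's summarization step: each member's assessment scheme is taken to be a mapping from criterion to grade, and the group report averages the grades per criterion.

from collections import defaultdict
from statistics import mean

def summarize_grades(individual_schemes):
    """Aggregate the grades given by each group member, per criterion."""
    per_criterion = defaultdict(list)
    for scheme in individual_schemes:
        for criterion, grade in scheme.items():
            per_criterion[criterion].append(grade)
    # Average each criterion's grades for the group report.
    return {c: round(mean(gs), 2) for c, gs in per_criterion.items()}

# Example: three members assessed two criteria on a 0-20 scale.
report = summarize_grades([
    {"completeness": 14, "clarity": 12},
    {"completeness": 16, "clarity": 10},
    {"completeness": 15, "clarity": 13},
])
print(report)  # {'completeness': 15, 'clarity': 11.67}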

  14. Challenges
     • Diversity of Web 2.0 tools:
       • adaptively recommend tools appropriate for the type of assessment to deliver, and make semantic use of the information that is transferred.
     • Difficulty of searching and filtering information:
       • search and filter the information appropriate to learners' individual needs and to the nature of the context.
     • Dynamic attribution of resources and resource interoperability:
       • provide a flexible process of resource attribution depending on the availability and knowledge of learners and tutors;
       • find a way to efficiently retrieve and exchange key resources;
       • enhance the interoperability of assessment resources through compliance with common standards such as QTI (a minimal example follows below).
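
To make the interoperability point concrete, here is a minimal QTI 2.1 choice item, embedded as a string in a short Python script; the identifier, prompt, and choices are invented for illustration. Any QTI-compliant tool can parse and exchange items of this form.

import xml.etree.ElementTree as ET

QTI_ITEM = """<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="specDocStructure" title="Specification document structure"
    adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>A</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which section typically opens a specification document?</prompt>
      <simpleChoice identifier="A">Purpose and scope</simpleChoice>
      <simpleChoice identifier="B">Test plan</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>"""

# Parsing the item as any compliant consumer would.
root = ET.fromstring(QTI_ITEM)
print(root.get("identifier"))  # -> specDocStructure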

  15. Semantic Models

  16. Semantic models
     • User model: timely information on users and learning contexts, in order to personalize the assessment according to learners' preferences and characteristics.
     • Assessment model:
       • selection of an appropriate assessment strategy and recommendation of assessment resources in a given context;
       • maintenance of an e-portfolio for every learner, containing their history and assessment results.
     • Context model: identify and recognize the current context of every learner, with reference to a context ontology (see the sketch below).
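
A sketch of what a context model instance could look like in RDF, using the rdflib library. The context ontology namespace and its properties (currentLocation, currentDevice, currentActivity) are hypothetical placeholders for the ontology still to be defined.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

# Hypothetical context ontology namespace.
CTX = Namespace("http://example.org/context#")

g = Graph()
learner = URIRef("http://example.org/learners/learner1")
g.add((learner, RDF.type, FOAF.Person))
g.add((learner, CTX.currentLocation, Literal("company")))
g.add((learner, CTX.currentDevice, Literal("smartphone")))
g.add((learner, CTX.currentActivity, Literal("interviewing prospective users")))

# Serialize the learner's current context for the assessment engine.
print(g.serialize(format="turtle"))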

  17. Semantic models (Cont.)
     • Domain model:
       • an efficient way to retrieve key resources;
       • metadata annotation of assessment resources;
       • reference to a domain ontology.
     • Resource model (metadata model):
       • domain and resource models are used to index resources;
       • some metadata can be generated automatically (sometimes on the fly) according to common vocabularies such as Dublin Core, SKOS, SIOC, FOAF, etc.;
       • most of these vocabularies are lightweight ontologies that fit database schemas well;
       • learners and/or teachers, however, need to define the relevant domain concepts describing a post (an annotation sketch follows below).
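
A sketch of how a learner's post might be annotated with these vocabularies using rdflib. Dublin Core (DCTERMS), SKOS, and SIOC are the real vocabularies named above; the resource URIs and the domain ontology namespace are hypothetical.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, SKOS

SIOC = Namespace("http://rdfs.org/sioc/ns#")
DOMAIN = Namespace("http://example.org/is-domain#")  # hypothetical domain ontology

g = Graph()
post = URIRef("http://example.org/posts/42")
g.add((post, DCTERMS.title, Literal("Draft specification document, v1")))
g.add((post, DCTERMS.creator, URIRef("http://example.org/learners/learner1")))
g.add((post, DCTERMS.subject, DOMAIN.SpecificationDocument))  # learner-defined concept
g.add((post, SIOC.has_container, URIRef("http://example.org/blogs/group3")))

# The domain concept itself is described with SKOS.
g.add((DOMAIN.SpecificationDocument, SKOS.prefLabel,
       Literal("Specification document", lang="en")))

print(g.serialize(format="turtle"))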

  18. Conclusion and future work
     • A scenario including self- and peer assessment in a pervasive environment.
     • Need for semantic models to enhance information retrieval and discovery in pervasive environments in order to deliver assessment.
     • Need for description and formalization of the models.
