


  1. Applying agent technology to evaluation tasks in e-learning environments http://cerg.csse.monash.edu.au/pedant

  2. Selby Markham, Jason Ceddia & Judy Sheard CSSE, Monash Colin Burvill & John Weir Mechanical and Manufacturing Engineering, U M Bruce Field Mechanical Engineering, Monash Linda Stern & Leon Sterling CSSE, UM

  3. The PEDANT project grew out of a desire/need to understand how current students deal with electronic materials.

  4. Aims: to investigate the relationship between the way students use on-line and interactive educational tools and the quality of their learning experience, using automated, agent-oriented software tools.

  5. The educational tools SiMLED A simulator for exploring engineering principles. Engineering

  6. The educational tools Algorithms in Action : AIA An interactive, visual tool for exploring programming algorithms CSSE

  7. The educational tools MOMUS Tutor A graphics-based tutorial system on mechanical engineering principles Engineering

  8. The educational tools Web Industrial Experience Resource : WIER A Web-based resource to support students in the Industrial Experience project CSSE

  9. What is a software agent? A bit like TRON

  10. What is a software agent? Your computer's virus checker

  11. What is a software agent? • It has a purpose • It can be given intelligence • It is responsive • It can be designed to learn • It acts independently

  12. What is a software agent? • It has a purpose • It is responsive • It can act independently

  13. It has a purpose • To monitor software output • To collect appropriate output • To organise that output

  14. It is responsive • The information can be analysed • Notifications can be given

  15. It can act independently When initiated it needs no support
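
As a concrete illustration of the three properties on slides 12-15, a monitoring agent can be sketched in a few lines. This is a hypothetical example only: the `LogAgent` name, the notification threshold, and the method names are invented for the sketch and are not PEDANT project code.

```python
# Illustrative sketch: a minimal monitoring agent showing purpose,
# responsiveness, and independence. All names here are hypothetical.

class LogAgent:
    """Monitors tool output, collects it, and organises it by user."""

    def __init__(self, notify_threshold=100):
        self.records_by_user = {}            # organised output, keyed by user
        self.notify_threshold = notify_threshold

    def observe(self, user, record):
        # Purpose: monitor the software's output and collect it.
        self.records_by_user.setdefault(user, []).append(record)
        # Responsiveness: analyse as it arrives and notify when warranted.
        if len(self.records_by_user[user]) >= self.notify_threshold:
            self.notify(user)

    def notify(self, user):
        print(f"User {user} has generated "
              f"{len(self.records_by_user[user])} records")

    def run(self, event_stream):
        # Independence: once initiated, it consumes events with no support.
        for user, record in event_stream:
            self.observe(user, record)
```

For example, `LogAgent(notify_threshold=50).run(events)` would silently organise records per user and print a notification only when a user crosses the threshold.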

  16. It can be given intelligence (Phase 2 of the project)

  17. But does it have Ethics?

  18. But why use agents?

  19. Software tools create issues that are not easily addressed by self-report techniques or observational techniques.

  20. Have you tried to analyse the log data from a Web-based tool? The extent of the data is mind-boggling

  21. Have you tried to trace the informational path used by the learner? How often have you been unable to trace your own path when working on the Web?

  22. The development of agent technology is a productive direction for achieving both short-term and long-term goals

  23. • Agent technology is technologically compatible with the teaching/learning tasks. • Agent technology can provide data about what the learner is doing rather than what he/she remembers doing. • Agent technology provides a tool with a highly pervasive ability to carry out functional formative assessment.

  24. Conceptualising the basic process

  25. [Diagram: process model relating Educator, Learner and Software: Learning Objectives, Learning Tasks, Responses; Learner Motives, Learner Behaviour, Learner Outcomes; Evaluation, Inferred Behaviour, Measures]

  26. [The same process-model diagram, annotated 2003]

  27. [The same process-model diagram, annotated 2003] • We will have achieved the primary aim of developing: • A rationale for defining pedagogical elements • Prototypical agents that can monitor the software • Analysis models that can apply to evaluation tasks

  28. [The same process-model diagram, annotated 2003] • The evaluation component: • The evaluation of functional usability will have begun • A format for evaluating the processes being used by students: a broader view of formative assessment

  29. [The same process-model diagram, with cells annotated 2003 and 2004] There are many more tasks to be done to fill in the underlying matrix

  30. The agent in evaluation

  31. Provides a means of monitoring process and relating process to educational goals

  32. This creates what can be called a process evaluation, as defined in the broader evaluation literature. It is similar to formative evaluation, but in the sense originally conceptualised by Scriven.

  33. The current process model for the project

  34. e.g.

  35. Action (User, Type, Time, [parameter list] )

  36. Action (User, Type, Time, [parameter list] ) action (user code, page, time [URLdata])

  37. Action (User, Type, Time, [parameter list] ) action (user code, page, time [URLdata]) download (403, file_manager, 994867577, [doc_rep: 0 Requirements-Model.doc])
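
A record like the one above can be split back into the Action (User, Type, Time, [parameter list]) form with a small parser. This sketch assumes the textual syntax shown on the slide; the splitting rules are illustrative, not the project's actual log parser.

```python
# Hedged sketch: parse one action record of the form
#   name (field, field, field, [parameter list])
# into its name, fields, and parameter list.
import re

def parse_action(line):
    """Split one record into (name, fields, params); None if malformed."""
    m = re.match(r"(\w+)\s*\(\s*(.*)\)\s*$", line.strip())
    if not m:
        return None
    name, body = m.group(1), m.group(2)
    # The parameter list is the bracketed tail, if any.
    params = []
    br = body.find("[")
    if br != -1:
        inner = body[br + 1 : body.rfind("]")]
        params = [p.strip() for p in inner.split(",") if p.strip()]
        body = body[:br].rstrip().rstrip(",")
    fields = [f.strip() for f in body.split(",") if f.strip()]
    return name, fields, params
```

Applied to the slide's example, `parse_action("download (403, file_manager, 994867577, [doc_rep: 0 Requirements-Model.doc])")` yields the name `download`, the fields `403`, `file_manager`, `994867577`, and the single bracketed parameter.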

  38. The practical meaning could be Learning tasks that the software enables: • exploring a particular area • accessing a particular resource

  39. Learner behaviours: • passive • lost wandering • directed interaction • linear, orderly interaction • exploration (at different levels of depth)
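
One way an agent might assign these categories is a simple heuristic over the sequence of pages a learner visited. The thresholds and rules below are invented for illustration and are not the project's behavioural model.

```python
# Illustrative heuristics only: mapping a page-visit sequence onto the
# behaviour categories listed above. Thresholds are assumptions.

def classify_behaviour(pages):
    """Very rough mapping from a session's page sequence to a category."""
    if len(pages) <= 1:
        return "passive"
    distinct = len(set(pages))
    revisit_ratio = 1 - distinct / len(pages)
    if distinct == len(pages):
        # No page visited twice: a straight path through the material.
        return "linear, orderly interaction"
    if revisit_ratio > 0.5:
        # Mostly revisits: going round in circles.
        return "lost wandering"
    if distinct >= 5:
        # Broad coverage with some revisiting.
        return "exploration"
    return "directed interaction"
```

For example, a session of `["a", "b", "a", "b", "a"]` (mostly revisits) would be labelled lost wandering, while `["a", "b", "c"]` would be labelled linear, orderly interaction.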

  40. FIN
