
Integrating Affect Sensors in an Intelligent Tutoring System



Presentation Transcript


  1. Integrating Affect Sensors in an Intelligent Tutoring System Sidney K. D’Mello, Scotty D. Craig, Barry Gholson, Stan Franklin, Rosalind Picard, and Arthur C. Graesser {sdmello|scraig|jbgholsn|franklin|a-graesser}@memphis.edu picard@media.mit.edu

  2. Overview • Introduction • Theoretical Background • Empirical Data Collection • Sensory Channels • Emotion Classification • Summary

  3. AutoTutor • A fully automated computer tutor • Simulates human tutors • Holds conversations with students in natural language • Grounded in constructivist theories of learning • AutoTutor's naturalistic dialog • Presents problems to the learner • Gives feedback to the student • Pumps, prompts, and assertions • Identifies and corrects misconceptions • Answers the student's questions • Effective learning system • Tested on more than 1,000 students; average effect size of 0.8 sigma
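To make the dialog-move vocabulary on this slide concrete, here is a minimal sketch of move selection driven by an LSA match score. The thresholds and the function name are invented for illustration; they are not AutoTutor's actual production rules.

```python
# Illustrative sketch only: mapping the LSA match between the student's
# answer and the expected answer onto a dialog move. Thresholds are
# invented; AutoTutor's real move selection is richer than this.

def choose_dialog_move(lsa_match: float) -> str:
    if lsa_match > 0.7:   # expectation well covered: assert and move on
        return "assertion"
    if lsa_match > 0.4:   # partial coverage: prompt for the missing piece
        return "prompt"
    return "pump"         # little coverage: "Tell me more about that..."
```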

  4. Project Goals • Identify the emotions that are most important during learning with AutoTutor • Find methods to reliably identify these emotions during learning by developing an Emotion Classifier • Program AutoTutor to automatically recognize and respond appropriately to emotions exhibited by learners, and to assess any learning gains • Test and augment theories that systematically integrate learning and emotion into educational practice.

  5. 2. Theoretical Background • The Stein & Levine Model • Goals, Emotions, & Learning • The Kort, Reilly, & Picard Model • The Emotional Learning Spiral • The Cognitive Disequilibrium Model

  6. Goals, Emotions, & Learning (Stein & Levine, 1991) • Why is behavior carried out? • To achieve and maintain goal states that ensure survival • People prefer to be in certain states and to avoid others (hedonistic model) • Goal-directed, problem-solving approach • Characteristic of emotional experience • Assimilating information into knowledge schemes • Emotional experience is associated with understanding new information • Learning normally occurs during an emotional episode

  7. The Emotional Learning Spiral (Kort, Reilly & Picard, 2001) [Figure: four-quadrant diagram with a vertical Learning Axis running from constructive learning (top) to un-learning (bottom) and a horizontal Affective Axis running from negative affect (left) to positive affect (right); quadrants labeled I, II, III, IV]

  8. Cognitive Disequilibrium Model (Graesser & Olde, 2003) • Cognitive disequilibrium • Plays an important role in comprehension and learning • Occurs when there is a mismatch with expectations • Activates conscious cognitive deliberation, questions, and inquiry • Aims to restore cognitive equilibrium • Cognitive disequilibrium and affective states • Confusion often accompanies cognitive disequilibrium • Confusion indicates uncertainty about what to do next

  9. 3. Empirical Data Collection • The Observational Study • The Emote-Aloud Study • The Gold Standard Study • Evaluating the Affect-Sensitive AutoTutor

  10. The Observational Study • Predictions: • Positive relationship with learning • Flow (Csikszentmihalyi, 1990) • Confusion (Graesser & Olde, 2003; Kort, Reilly, & Picard, 2001) • Eureka (Kort, Reilly, & Picard, 2001) • Negative relationship with learning • Boredom (Csikszentmihalyi, 1990; Miserandino, 1996) • Frustration (Kort, Reilly, & Picard, 2001; Patrick et al., 1993) • Correlations: [correlation table not reproduced in transcript]

  11. Emote-Aloud Study • Emotions of Interest: • Anger, Boredom*, Confusion*, Contempt, Curiosity, Disgust, Eureka, Frustration* • Methodology: • 7 emote-aloud participants; 10 hours of interaction in total • Participants given a list of the 8 emotions with descriptions • Clips coded from 3 seconds before the participant started talking • Two raters coded the video clips with a reliability of 0.9 (kappa) • Preliminary Results (numbers denote facial action units): • Frequent Itemsets: • Frustration: {1}, {2}, {1,2}, {14} • Confusion: {4}, {7}, {4,7}, {12} • Boredom: {43} • Association Rules: • Frustration: {1} → {2}, {2} → {1} • Confusion: {7} → {4}
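The frequent-itemset and association-rule results above can be reproduced with a brute-force search, which is tractable at action-unit scale. The sketch below uses fabricated episode data; only the method, not the numbers, comes from the slide.

```python
# Each "transaction" is the set of facial action units (AUs) observed
# during one emote-aloud episode. Episode data is fabricated.
from itertools import combinations

episodes = [{1, 2}, {1, 2, 14}, {2}, {1}, {2, 14}]  # hypothetical AU sets

def frequent_itemsets(transactions, min_support=0.4, max_size=2):
    """Brute-force frequent itemset search (fine at AU scale)."""
    items = sorted({au for t in transactions for au in t})
    n = len(transactions)
    result = {}
    for k in range(1, max_size + 1):
        for combo in combinations(items, k):
            support = sum(set(combo) <= t for t in transactions) / n
            if support >= min_support:
                result[combo] = support
    return result

def confidence(lhs, rhs, transactions):
    """Confidence of the rule lhs -> rhs, e.g. {1} -> {2}."""
    lhs_count = sum(lhs <= t for t in transactions)
    both = sum((lhs | rhs) <= t for t in transactions)
    return both / lhs_count if lhs_count else 0.0

print(frequent_itemsets(episodes))
print(confidence({1}, {2}, episodes))  # strength of the {1} -> {2} rule
```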

  12. Gold Standard Study • Procedure: • Session one • Participants (N=30) interact with AutoTutor • Data collected with sensors: BPMS, Blue Eyes camera, AutoTutor logs • Participants view their own videos and give affect ratings • Session two (one week later) • Participants view another participant's video and give affect judgments every 20 seconds • Expert judges (N=2) give affect ratings • Affective States: • Boredom, Confusion, Flow, Frustration, Delight, Neutral, Surprise

  13. 4. Sensory Channels • Three current methods • Posture Patterns - Body Pressure Measurement System • Facial Expressions - IBM Blue Eyes camera • Conversational Cues - AutoTutor text dialog • Two other possible methods • Force exerted on the mouse • Force exerted on the keyboard

  14. [Figure: sensor channels feeding AutoTutor - visual (IBM Blue Eyes camera), posture (Body Pressure Measurement System), pressure (force-sensitive mouse and keyboard), and AutoTutor's text dialog]

  15. Facial Expressions: The IBM Blue Eyes Camera [Images: the red-eye effect, the IBM Blue Eyes camera, and eyebrow templates]
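The Blue Eyes camera exploits the red-eye effect: pupils appear bright under on-axis infrared illumination and dark under off-axis illumination, so differencing two synchronized frames localizes them. The sketch below is a schematic reconstruction of that idea, not IBM's implementation; frame acquisition is stubbed out.

```python
# Schematic pupil localization via the red-eye effect: subtract the
# dark-pupil (off-axis IR) frame from the bright-pupil (on-axis IR)
# frame and keep the pixels that got much brighter.
import numpy as np

def find_pupil_candidates(on_axis: np.ndarray, off_axis: np.ndarray,
                          thresh: int = 60) -> np.ndarray:
    """Return (row, col) coordinates of candidate pupil pixels."""
    diff = on_axis.astype(int) - off_axis.astype(int)  # avoid uint8 underflow
    return np.argwhere(diff > thresh)
```

In the full system the pupil candidates would then be tracked over time and combined with the eyebrow templates to estimate facial features.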

  16. Posture Patterns: The Body Pressure Measurement System

  17. Posture Patterns: Classification Results (Mota & Picard, 2003) • Static Posture Patterns: • Leaning Forward, Leaning Forward Left, Leaning Forward Right, Leaning Back, Leaning Back Left, Leaning Back Right, Sitting Upright, Sitting on the Edge of the Seat, Slumping Back • Accuracy: 87.64% (10 subjects: 5 for training, 5 for testing) • Recognizing Interest: • High interest, Low interest, Taking a break • Accuracy: • 82.25% (8 subjects) • 76.49% (2 new subjects)
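To make the posture pipeline concrete, the sketch below classifies a BPMS pressure map by nearest prototype. Mota and Picard actually used a neural-network approach; the nearest-neighbor stand-in, labels, and prototypes here are purely illustrative.

```python
# Illustrative posture classification over BPMS pressure maps, treated
# as 2-D arrays of sensor readings. Nearest-prototype is a stand-in for
# the neural networks used in the actual study.
import numpy as np

def classify_posture(pressure_map, prototypes, labels):
    """Assign the label of the nearest prototype map (Euclidean distance)."""
    x = pressure_map.ravel().astype(float)
    dists = [np.linalg.norm(x - p.ravel().astype(float)) for p in prototypes]
    return labels[int(np.argmin(dists))]

# Hypothetical usage: one prototype map per static posture class.
labels = ["Leaning Forward", "Leaning Back", "Sitting Upright"]
prototypes = [np.random.rand(42, 48) for _ in labels]  # placeholder maps
print(classify_posture(np.random.rand(42, 48), prototypes, labels))
```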

  18. Conversational Cues: AutoTutor's Text Dialog [Screenshot: student answers and their LSA matches]

  19. Conversational Cues: Relevant Channels • Speech Act Classifier: • Metacommunicative, Metacognitive, Shallow Comprehension, Deep Comprehension, Contribution • Cosine Scores (local and global): • Max Good Expectation Match, Max Bad Expectation Match • Delta Change • Response Content: • Number of characters, Number of words • Advancer: • Hint, Prompt, Assertion, Prompt Completion, Pump, Splice, Summary, Misconception Verification • Feedback: • Positive, Neutral Positive, Neutral Neutral, Neutral Negative, Negative
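A minimal sketch of extracting some of these features, assuming LSA vectors for the student's answer and for the good/bad expectations are already available. The function and variable names are hypothetical, not AutoTutor's internals.

```python
# Sketch of the dialog-feature channel: cosine "expectation match" scores
# plus the simple response-content features listed on the slide.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) /
                 (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def dialog_features(answer_vec, good_expectations, bad_expectations, answer_text):
    return {
        "max_good_match": max(cosine(answer_vec, e) for e in good_expectations),
        "max_bad_match":  max(cosine(answer_vec, e) for e in bad_expectations),
        "n_chars": len(answer_text),
        "n_words": len(answer_text.split()),
    }
```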

  20. 5. Emotion Classification • Approaches to Classification • Individual Emotion Classifiers: • Standard Classifiers • Biologically Motivated Classifiers • Classifier Integration

  21. Approaches to Emotion Classification • Analysis level • Fine-grained (pixels) - M.I.T. • Coarse (action units, posture patterns) - University of Memphis • Sensory Channels: • Integrated approach: • Classify all sensory data at the same time • Distributed approach: • Classify each sensory channel individually • Integrate the individual classifications into a single overall (super) classification
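The two architectures can be contrasted in a few lines. The classifier objects below are placeholders with a scikit-learn-style `predict()` interface, and the majority vote stands in for whatever "super classification" is used; this is a sketch of the design choice, not project code.

```python
# Integrated vs. distributed classification over the three channels.
# Feature vectors are plain Python lists here.
from collections import Counter

def integrated(face, posture, dialog, clf):
    # Integrated: one classifier sees all channels' features concatenated.
    return clf.predict([face + posture + dialog])[0]

def distributed(face, posture, dialog, face_clf, posture_clf, dialog_clf):
    # Distributed: classify each channel alone, then combine the decisions.
    votes = [face_clf.predict([face])[0],
             posture_clf.predict([posture])[0],
             dialog_clf.predict([dialog])[0]]
    return Counter(votes).most_common(1)[0][0]  # simplest super classifier
```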

  22. Biologically Motivated Classifiers: Background • Olfaction in rabbits: is it a fox or a carrot? • Freeman: ten years looking at patterns • Not patterns, but basins of attraction • A sniff destabilizes the olfactory bulb • The bulb settles into some attractor basin • Which basin it settles into identifies the odor • Freeman: theoretical model, the K model • Kozma/Harter: computational model, the KA model

  23. Biologically Motivated Classifiers: KIII as a Classifier • Compares favorably with: • Statistical classification methods • Feedforward neural networks • Performance: • More robust • More noise tolerant • Classifies objects that are not linearly separable by any set of features • Learning: • Hebbian reinforcement • Habituation • New categories can be added without loss of existing categories
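The two learning rules named on this slide can be sketched in generic form. In the actual K/KA models they operate inside coupled oscillatory dynamics, so the weight updates below are only schematic, with invented rate constants.

```python
# Generic forms of the two learning rules: Hebbian reinforcement
# strengthens connections between co-active units, while habituation
# weakens connections that are repeatedly driven without reinforcement.
import numpy as np

def hebbian_update(W, activity, lr=0.01):
    """dW ~ a_i * a_j for co-active units i, j."""
    return W + lr * np.outer(activity, activity)

def habituation_update(W, active_mask, decay=0.995):
    """Slowly weaken the links among the currently active units."""
    W = W.copy()
    W[np.ix_(active_mask, active_mask)] *= decay
    return W
```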

  24. Classifier Integration • Problem: • Three sensory channels • Each produces output at different intervals • The outputs require different interpretations • Solution: • A network with nodes representing emotions • Emotion nodes are connected by excitatory and inhibitory links • Activation (excitation or inhibition) spreads along the links • Sensory channels activate the emotion nodes with varying degrees of activation • Activations decay over time • Over time, an emotion node with activation above a threshold is chosen as the representative emotion
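A minimal sketch of the integration network this slide describes. The emotion list follows the Gold Standard Study; the link weights, decay rate, and threshold are invented constants, not values from the project.

```python
# Spreading-activation integration network: sensory channels inject
# activation into emotion nodes, activation spreads along excitatory
# (positive) and inhibitory (negative) links, decays over time, and a
# node above threshold is reported as the current emotion.
import numpy as np

EMOTIONS = ["boredom", "confusion", "flow", "frustration", "delight", "neutral"]

class EmotionNetwork:
    def __init__(self, links, decay=0.9, threshold=1.0):
        self.a = np.zeros(len(EMOTIONS))  # activation per emotion node
        self.links = links                # square matrix: +excite / -inhibit
        self.decay = decay
        self.threshold = threshold

    def step(self, channel_inputs):
        """channel_inputs: summed evidence per emotion from all sensors."""
        self.a = self.decay * self.a            # activations decay over time
        self.a += channel_inputs                # sensors inject activation
        self.a += self.links @ self.a           # spread along the links
        self.a = np.clip(self.a, 0.0, None)     # keep activations non-negative
        above = np.argwhere(self.a > self.threshold).ravel()
        return EMOTIONS[int(self.a.argmax())] if above.size else None
```

Because each channel can inject evidence whenever it produces output, the network naturally absorbs the different reporting intervals noted under "Problem" above.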

  25. Current Status: Empirical Data Collection • Current Status: • Observational study - complete • Emote-aloud study - complete • Gold Standard study - data collection complete • Future Work: • Preliminary analysis of Gold Standard study data • Action unit encoding of Gold Standard study data • Replication of the Gold Standard study with speech recognition

  26. Current Status: Emotion Classification • Current Status: • Associating action units with emotions • KAIII implemented and tested • BPMS cluster analysis complete • Dialog channels mined • Blue Eyes software implemented • Future Work: • Detection of interesting emotion sequences • Individual sensory channel classification • Classifier integration

  27. Acknowledgements • Funding sources for the University of Memphis: • NSF ITR 0325428 • Steelcase (BPMS) • Researchers: • The University of Memphis: • Dr. Max Louwerse, Patrick Chipman, Jeremiah Sullins, • Bethany McDaniel, Amy Witherspoon • MIT: • Dr. Barry Kort, Dr. Rob Reilly, Ashish Kapoor
