
Designing and Evaluating Life-like Agents as Social Actors


Presentation Transcript


  1. Designing and Evaluating Life-like Agents as Social Actors Helmut Prendinger Dept. of Information and Communication Eng. Graduate School of Information Science and Technology University of Tokyo helmut@miv.t.u-tokyo.ac.jp http://www.miv.t.u-tokyo.ac.jp/~helmut/helmut.html

  2. Short Bio: education and experience • Master’s in Logic (1994) • U. of Salzburg, Austria, Dept. of Logic and Philosophy of Science • Dynamic modal logic (completeness, decidability) • Non-degree studies in Psychology, Linguistics, Literature • Ph.D. in Artificial Intelligence (1998) • U. of Salzburg, Dept. of Logic and Philosophy of Science and Dept. of Computer Science; U. of California, Irvine • Incomplete reasoning (deduction, hypothetical reasoning, EBL) • Postdoctoral research • U. of Tokyo, Ishizuka Lab • JSPS Fellowship (4/1998-3/2000): knowledge compilation, hypothetical reasoning • “Mirai Kaitaku” project (since 4/2000): life-like characters, affective communication with animated agents, markup languages for animated agents, emotion recognition

  3. Social Computing: main objective and task • Social Computing aims to support the tendency of humans to interact with computers as social actors. • The task: develop technology that reinforces the human bias towards social interaction through appropriate feedback, in order to improve communication between humans and computational devices.

  4. Social Computing: realization • Most naturally, social computing can be realized by using life-like characters.

  5. Life-like Characters at Work: sample applications • Tutoring (USC) • Knowledge Sharing (ATR) • Presentation (U. of Tokyo) • Sales (DFKI) • Entertainment (MIT)

  6. Life-like Characters: desiderata • Life-like characters should be • empathic and engaging as tutors • trustworthy as sales personas • entertaining and consistent as actors • stimulating as match-makers • convincing as presenters • (in short) … social actors • [… and competent] • Life-like characters should enable • effective and natural communication with humans

  7. Background: computers as social actors • Humans are biased to treat computers like real people • Psychological studies show that people tend to treat computers as social actors (like other humans) • Tendency to be nicer in “face-to-face” interactions, ... • Animated agents may support this tendency if they are designed as social actors Ref.: B. Reeves and C. Nass, 1998. The Media Equation. Cambridge University Press, Cambridge.

  8. Animated Agents as Social Actors: requirements for life-likeness • Embodiment (features of life-like characters): synthetic bodies, emotional facial display, communicative gestures, posture, affective voice • Artificial emotional mind: affect-based response, personality, response adjusted to social context (social role awareness), adaptive behavior (social intelligence)

  9. Outline: designing and evaluating life-like characters • The mind of life-like agents • Emotion, social role awareness, attitude change • Demo: Casino scenario • Implementation and character behavior scripting • Evaluating life-like characters • Using biosignals to detect user emotions • Experimental study with character-based quiz game • Book project: character scripting languages and applications

  10. SCREAM System Architecture: SCRipting Emotion-based Agent Minds

  11. Appraisal Module: the cognitive structure of emotions • Evaluates external events according to their emotional significance for the agent • Outputs emotions • joy, distress • happy for, sorry for • angry at • resent, gloat • … 22 in total Ref.: A. Ortony, G. Clore, A. Collins, 1988. The Cognitive Structure of Emotions. Cambridge University Press, Cambridge.
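To make the appraisal step concrete, here is a minimal Python sketch (not the actual SCREAM implementation): an event is appraised against the agent's goals and its attitude toward the affected agent. The dictionary structures, the desirability scale, and the reduced emotion set are assumptions for illustration only.

# Minimal OCC-style appraisal sketch (illustrative only, not the SCREAM code).
def appraise(agent, event):
    """Return a list of (emotion, intensity) pairs for one external event."""
    emotions = []
    desirability = agent["goals"].get(event["outcome"], 0)   # -5 .. +5, assumed scale
    if desirability > 0:
        emotions.append(("joy", desirability))
    elif desirability < 0:
        emotions.append(("distress", -desirability))

    # Fortunes-of-others emotions depend on the attitude toward the other agent.
    other = event.get("affects_other")
    if other is not None:
        liking = agent["attitudes"].get(other, 0)             # liking (+) / disliking (-)
        if event["good_for_other"]:
            emotions.append(("happy for" if liking >= 0 else "resent", abs(liking)))
        else:
            emotions.append(("sorry for" if liking >= 0 else "gloat", abs(liking)))
    return emotions

agent = {"goals": {"user_wins_game": 1}, "attitudes": {"user": 2}}
event = {"outcome": "user_wins_game", "affects_other": "user", "good_for_other": True}
print(appraise(agent, event))   # [('joy', 1), ('happy for', 2)]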

  12. Social Filter Module: emotion expression modulating factors • Ekman and Friesen’s facial “Display Rules” (’69): expression and intensity of emotions is governed by social and cultural norms • Brown and Levinson (’87) on linguistic style: linguistic style is determined by social variables (power, distance, imposition of speech acts)
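A minimal sketch of such a social filter, assuming a simple damping formula over power and distance; the emotion classes, thresholds and formula are illustrative assumptions, not the SCREAM rules.

# Illustrative social-filter sketch: the intensity of an *expressed* emotion is
# modulated by social power and distance (after Brown & Levinson).
NEGATIVE = {"distress", "angry at", "resent", "gloat"}

def filter_expression(emotion, intensity, power_over_addressee, distance):
    """Return the intensity actually displayed to the addressee (0 = suppress)."""
    if emotion in NEGATIVE:
        # The less power over the addressee and the more distance, the stronger the damping.
        displayed = intensity - distance - max(0, -power_over_addressee)
    else:
        displayed = intensity          # positive emotions pass largely unfiltered here
    return max(0, displayed)

print(filter_expression("angry at", 4, power_over_addressee=-1, distance=2))  # 1
print(filter_expression("joy", 3, power_over_addressee=-1, distance=2))       # 3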

  13. Agent Model: character profile, affect processing • Character Profile • static and dynamic features • Static features: personality traits, standards • Dynamic features: goals, beliefs, attitudes • Attitudes (liking/disliking) are an important source of emotions toward other agents • an agent’s attitude decides whether it has a positive or negative emotion (toward another agent): “happy for” vs. resent; “sorry for” vs. gloat • an agent’s attitude changes as a result of communication, dependent on the “affective interaction history”

  14. Signed Summary Record: computing attitude from affective interaction history • The interaction history stores the winning emotional states as <emotion, intensity> pairs over time, e.g. joy (2), distress (1), distress (3), hope (2), angry at (2), good mood (1), happy for (2), gloat (1) • Positive and negative emotions are summed into a signed attitude summary value: liking if the value is positive, disliking if it is negative Ref.: A. Ortony, 1991. Value and emotion. In: W. Kessen, A. Ortony, and F. Craik (eds.), Memories, Thoughts, and Emotions: Essays in honor of George Mandler. Hillsdale, NJ: Erlbaum, 337-353.
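A minimal sketch of the signed summary computation; the sign assignment per emotion type and the example history are illustrative assumptions rather than the exact classification used in SCREAM.

# Sketch of the signed summary record (after Ortony '91): winning
# <emotion, intensity> pairs are summed with a sign per emotion class.
POSITIVE = {"joy", "hope", "happy for", "good mood"}
NEGATIVE = {"distress", "angry at", "resent"}

def summary_value(history):
    """history: list of (emotion, intensity) winning emotional states."""
    total = 0
    for emotion, intensity in history:
        sign = 1 if emotion in POSITIVE else -1 if emotion in NEGATIVE else 0
        total += sign * intensity
    return total

history = [("joy", 2), ("distress", 1), ("hope", 2), ("angry at", 2), ("good mood", 1)]
value = summary_value(history)                           # 2 - 1 + 2 - 2 + 1 = +2
print(value, "liking" if value > 0 else "disliking")     # 2 liking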

  15. Updating Attitude: weighted update rule • If a high-intensity emotion of opposite sign occurs (e.g., a liked interlocutor makes the agent very angry), the agent either • ignores the “inconsistent” new information, or • updates the summary value by giving greater weight to the “inconsistent” information (“primacy of recency”, Anderson ’65): Φ_ε(Sit_n) = Φ_ε(Sit_{n-1}) × h + ω_ε(Sit_n) × r, where ω is the intensity of the (winning) emotion, ε ∈ {+, −}, and h/r are the historical/recency weights • Example (liked interlocutor with summary value 3, anger of intensity 5, h = 0.25, r = 0.75): −3 = (3 × 0.25) − (5 × 0.75), i.e., liking turns into disliking • Consequence for future interaction with the interlocutor • Momentary disliking: new value is active for the current situation only • Essential disliking: new value replaces the summary record
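The update rule can be written directly in code; this sketch reproduces the slide's numerical example with h = 0.25 and r = 0.75 (the function name and signature are only illustrative).

# Sketch of the weighted attitude update: the new summary value combines the
# previous value (historical weight h) with the signed intensity of the current
# "inconsistent" emotion (recency weight r): -3 = (3 * 0.25) - (5 * 0.75).
def update_attitude(previous, emotion_intensity, emotion_sign, h=0.25, r=0.75):
    """emotion_sign is +1 for a positive, -1 for a negative winning emotion."""
    return previous * h + emotion_sign * emotion_intensity * r

# A liked interlocutor (summary value +3) makes the agent very angry (intensity 5):
print(update_attitude(3, 5, -1))   # -3.0 -> momentary or essential disliking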

  16. Life-like Agents: making them act and speak • Realization of embodiment • 2D animation sequences • Synthetic affective speech • Technology • Microsoft Agent package (installed client-side) • JavaScript-based interface in Internet Explorer • Microsoft Agent package • Controls to trigger character actions • Text-to-Speech (TTS) engine • Voice recognition • Multi-modal Presentation Markup Language (MPML) • Easy-to-use XML-style authoring tool • Interface with the SCREAM system

  17. Life-like Characters in Interaction: some demos • Casino Scenario: life-like characters that change their attitude during interaction • Business Scenario: animated agents that storify tacit corporate knowledge • Comics Scenario: animated comics actors engaging in developing social relationships

  18. Casino Scenario: life-like characters with changing attitude • Animated advisor (“Genie”): emotion, personality; changes attitude dependent on the interaction history with the user • Dealer (“James”) and player (“Al”): pre-scripted behavior • User in the role of the player of a Black Jack game • Implemented with MPML and SCREAM
Genie‘s Character Profile:
% Personality specification
personality_type(genie,agreeableness,3).
personality_type(genie,extraversion,2).
% Social variables specification
social_power(genie,user,0,_).
social_distance(genie,user,1,_).
% Goals
wants(genie,user_wins_game,1,_).
wants(genie,user_follows_advice,4,_).
% Attitude
attitude(genie,user,likes,1,init).

  19. Emotional Arc: advisor’s dominant emotions depending on attitude (advisor has an agreeable personality and is socially slightly distant to the user)
Round 1: pos. attitude, user ignores advice and loses; internal: distress (4), expressed: distress (1)
Round 2: pos. attitude, user ignores advice and loses; internal: sorry for (4), expressed: sorry for (5)
Round 3: neg. attitude, user ignores advice and loses; internal: gloat (5), expressed: gloat (2)
Round 4: pos. attitude, user follows advice and loses; internal: sorry for (5), expressed: sorry for (5)
Round 5: pos. attitude, user ignores advice and wins; internal: good mood (5), expressed: good mood (5)
(internal = intensity values of the appraised emotion; expressed = intensity values of the expressed emotion)

  20. Implementation

  21. Agent Scripting: simple MPML script
<!-- Example MPML script -->
<mpml>
 …
 <scene id="introduction" agents="james,al,spaceboy">
  <seq>
   <speak agent="james">Do you guys want to play Black Jack?</speak>
   <speak agent="al">Sure.</speak>
   <speak agent="spaceboy">I will join too.</speak>
   <par>
    <speak agent="al">Ready? You got enough coupons?</speak>
    <act agent="spaceboy" act="applause"/>
   </par>
  </seq>
 </scene>
 …
</mpml>
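As a rough illustration of the <seq>/<par> semantics used above (children of <seq> play one after another, children of <par> start together), here is a small Python sketch that computes the duration of a scene; the node durations are invented numbers and this is not the actual MPML player.

# Illustrative scheduling sketch for <seq> and <par> blocks.
def duration(node):
    if node["tag"] in ("speak", "act"):
        return node.get("secs", 1.0)
    if node["tag"] == "seq":
        return sum(duration(child) for child in node["children"])   # one after another
    if node["tag"] == "par":
        return max(duration(child) for child in node["children"])   # longest child wins
    return 0.0

scene = {"tag": "seq", "children": [
    {"tag": "speak", "secs": 2.0},                      # james: "Do you guys ..."
    {"tag": "speak", "secs": 1.0},                      # al: "Sure."
    {"tag": "par", "children": [
        {"tag": "speak", "secs": 2.5},                  # al: "Ready? ..."
        {"tag": "act", "secs": 1.5}]}]}                 # spaceboy: applause
print(duration(scene))   # 5.5 seconds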

  22. Mind-Body Interface: interfacing SCREAM with MPML
<!-- MPML script showing the interface with SCREAM -->
<mpml>
 …
 <consult target="[…].jamesApplet.askResponseComAct('james','al','5')">
  <test value="response25">
   <act agent="james" act="pleased"/>
   <speak agent="james">I am so happy to hear that.</speak>
  </test>
  <test value="response26">
   <act agent="james" act="decline"/>
   <speak agent="james">We can talk about that another time.</speak>
  </test>
  …
 </consult>
 …
</mpml>

  23. Alternative View: smart characters vs. smart environments • “Sense-think-act” cycle (classical AI approach): Internet softbots search for information on the web, robots explore their environment; all the intelligence is agent-side: the agent “perceives” the game state, infers “I am happy”, and “acts” by expressing happiness • “Annotated” environments: shift from agent intelligence to environment intelligence (semantic web, ubiquitous computing, affordance theory); the environment “tells” the agent the available behaviors from a behavior repository and instructs it, e.g., “be happy now”; agents and environments can be developed independently

  24. Outline revisited: designing and evaluating life-like characters • The mind of life-like agents • Emotion, social role awareness, attitude change • Demo: Casino scenario • Implementation and character behavior scripting • Evaluating life-like characters • Using biosignals to detect user emotions • Experimental study with character-based quiz game • Book project: character scripting languages and applications

  25. Affective Computing: why should a computer recognize user emotions? • Human-human communication • Based on efficient grounding mechanisms, including the ability to recognize interlocutors’ emotions (frustration, confusion, …) • Humans may react appropriately upon detection of an interlocutor’s emotion (clarification upon confusion) • Human-computer communication • Computers typically lack the ability to recognize user emotions • Ignoring users’ emotions causes user frustration • Recognizing and responding to users’ (often) negative emotions may improve users’ interaction experience Ref.: R. Picard, 1997. Affective Computing. The MIT Press.

  26. Emotion Recognition: how can computers recognize users’ emotions? • Stereotypes • A typical visitor of a casino wants… (to win) • Communicative modalities • Facial display (face recognition) • Prosody (speech analysis) • Linguistic style (NLU) • Gestures (gesture recognition) • Posture (posture recognition) • Physiological data • Biosignals

  27. Physiological Data Assessment: the ProComp+ unit • EMG: Electromyography • EEG: Electroencephalography • EKG: Electrocardiography • BVP: Blood Volume Pulse • GSR: Galvanic Skin Response • Respiration • Temperature (pictured: GSR and BVP sensors)

  28. Inferring Emotions from Biosignals: Lang’s 2-dimensional emotion model • Lang’s two dimensions • Valence: positive or negative dimension of feeling • Arousal: degree of intensity of emotional response • Biometric measures • Skin conductivity increases with arousal (Picard ’97) • Heart rate increases with negatively valenced emotions • Note • Introverts reach a higher level of emotional arousal than extroverts [Figure: some named emotions in the arousal-valence space, e.g. excited, enraged, joyful, relaxed, sad, depressed] Ref.: Lang, P. 1995. The emotion probe: Studies of motivation and attention. American Psychologist 50(5):372-385.
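A minimal sketch of how the two biometric measures above could be mapped onto a quadrant of the arousal-valence space; the baselines, thresholds and quadrant labels are assumptions for illustration only.

# Illustrative mapping from skin conductance and heart rate to a named emotion
# via Lang's arousal-valence quadrants.
def name_emotion(skin_conductance, heart_rate, sc_baseline, hr_baseline):
    arousal = "high" if skin_conductance > sc_baseline else "low"      # SC rises with arousal
    valence = "negative" if heart_rate > hr_baseline else "positive"   # HR rises with negative valence
    quadrants = {
        ("high", "positive"): "joyful/excited",
        ("high", "negative"): "frustrated/enraged",
        ("low", "positive"): "relaxed",
        ("low", "negative"): "sad/depressed",
    }
    return quadrants[(arousal, valence)]

print(name_emotion(6.2, 85, sc_baseline=5.0, hr_baseline=72))   # frustrated/enraged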

  29. Experimental Study: effects of a character-based interface (experimenter/analyser: Junichiro Mori) • Aim of study • Show that a character with affective expression may improve users’ experience (i.e., reduce frustration) of a simple quiz game • Method • Biosignals to measure skin conductance and blood volume pulse (‘objective’ assessment of user experience) • Questionnaire (users’ subjective assessment) • Instruction • Addition/subtraction task (short-term memory load) • Solve a series of 30 quizzes correctly and as fast as possible • Frustration is deliberately caused by delay (in 6 out of 30 quizzes) • Subjects • 20 university students (all male Japanese, approx. 24 years old) • JPY 1000 for participation, JPY 5000 for the best score

  30. Experimental Setup

  31. Instruction: mathematical quiz game • Add 5 numbers and subtract the i-th number (i < 5), e.g. 1 + 3 + 8 + 5 + 4 = [21]; subtract the 2nd number; result: 18 • Select the correct answer by clicking the radio button next to the number • The character then tells whether the answer is correct (“It is correct.”, polite language) • A timer is displayed, and in some quizzes a delay (6 – 14 sec.) occurs before the character responds
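A small sketch of the quiz task as described (numbers and the subtracted position are drawn at random here; the example in the comment matches the slide's numbers).

# Minimal sketch of the addition/subtraction quiz generator.
import random

def make_quiz():
    numbers = [random.randint(1, 9) for _ in range(5)]
    i = random.randint(1, 4)                     # "subtract the i-th number (i < 5)"
    answer = sum(numbers) - numbers[i - 1]
    return numbers, i, answer

numbers, i, answer = make_quiz()
print(numbers, f"sum = {sum(numbers)}, minus number {i} -> {answer}")
# e.g. [1, 3, 8, 5, 4] sum = 21, minus number 2 -> 18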

  32. Two Versions of the Game: affective vs. non-affective (independent variables)

  33. Character Responses: examples of affective/non-affective feedback • Affective feedback • “I am sorry. It is wrong.” (hyper-polite language), with a hanging-shoulder gesture to express sorriness non-verbally • “I am sorry for the delay.” (polite language): the character apologizes for the delay • Non-affective feedback • “Wrong.”, with no non-verbal emotion expression • The character ignores the occurrence of the delay

  34. Analyzing Physiological User Data • GSR and BVP signals are recorded with the Biograph software (Thought Technologies) • The signals are segmented into a DELAY segment (from “delay starts” to “delay ends”) and a RESPONSE segment (from the agent’s response to the user’s response) • BVP could not be taken reliably
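A sketch of the segment-wise analysis implied above: a sampled GSR trace is cut into DELAY and RESPONSE segments and the mean change against a pre-stimulus baseline is computed. The event times and the synthetic trace are invented for illustration.

# Segment-wise mean of a sampled GSR trace (time in s, value in microsiemens).
def segment_mean(samples, start, end):
    values = [v for t, v in samples if start <= t < end]
    return sum(values) / len(values)

samples = [(float(t), 5.0 + 0.2 * t) for t in range(20)]         # synthetic 20 s trace
baseline      = segment_mean(samples, 0.0, 5.0)                  # pre-stimulus
delay_mean    = segment_mean(samples, 5.0, 12.0) - baseline      # delay starts .. delay ends
response_mean = segment_mean(samples, 12.0, 15.0) - baseline     # agent response .. user response
print(round(delay_mean, 2), round(response_mean, 2))             # mean SC change per segment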

  35. Preliminary Findings: 9 subjects in each version (data of 2 subjects discarded) • Hypothesis (design): delay induces frustration in subjects • All 18 subjects showed a significant rise of SC in the DELAY segment • Corresponds to a finding in behavioral psychology (if an individual is prohibited from attaining a goal, the individual experiences primary frustration) • Hypothesis (main): affective agent behavior reduces user frustration • Mean values of SC, DELAY vs. RESPONSE segments (BVP could not be taken reliably): non-affective version mean = 0.05, affective version mean = 0.2 • t-test (assuming unequal variance): t(16) = 2.57, p = .01 • Preliminary evaluation suggests that an animated character expressing emotions and empathy may undo some of the user’s frustration.
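The reported comparison is a two-sample t-test assuming unequal variances (Welch's test); a SciPy sketch is shown below. The per-subject values are hypothetical, chosen only so that the group means match the slide (0.2 vs. 0.05) with 9 subjects per group.

# Welch's t-test over per-subject mean SC values of the two game versions.
from scipy import stats

affective     = [0.31, 0.15, 0.22, 0.18, 0.25, 0.12, 0.20, 0.19, 0.18]  # hypothetical, mean = 0.20
non_affective = [0.02, 0.08, 0.05, 0.03, 0.09, 0.04, 0.06, 0.05, 0.03]  # hypothetical, mean = 0.05

t, p = stats.ttest_ind(affective, non_affective, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")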

  36. Agents Adapting to User Emotion: assumes real-time recognition of user emotions • Dynamic Decision Network (simplified): at each time slice t_i, the user model (user’s traits, emotional state, learning) is inferred from evidence nodes (the user’s action, and bodily expressions read through sensors); the agent’s actions and a utility node U link the slice at t_i to the slice at t_{i+1} • QUESTION: Given the user’s state at t_i, which agent action will maximize the agent’s expected utility at t_{i+1}, in terms of, e.g., the user’s learning and emotion?
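The decision-theoretic question above amounts to choosing the action with maximal expected utility over the predicted user state; a minimal sketch follows, with invented transition probabilities and utilities.

# Expected-utility action selection over predicted user states at t_{i+1}.
actions = {
    "provide help": {"joy": 0.6, "reproach": 0.1, "shame": 0.3},   # P(state | action), assumed
    "do nothing":   {"joy": 0.2, "reproach": 0.5, "shame": 0.3},
}
utility = {"joy": 1.0, "reproach": -0.5, "shame": -0.2}            # e.g. user learning + emotion

def expected_utility(action):
    return sum(p * utility[state] for state, p in actions[action].items())

best = max(actions, key=expected_utility)
print({a: round(expected_utility(a), 2) for a in actions}, "->", best)
# {'provide help': 0.49, 'do nothing': -0.11} -> provide help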

  37. Dynamics of User Emotions • Network sketch: user personality (extraversion, agreeableness), user goals (succeed by myself, have fun) and the agent’s action (provide help, do nothing) determine the user’s emotional state at t_{i+1} (joy, reproach, shame) • Valence (positive/negative) and arousal are reflected in bodily expressions such as eyebrow position (down/frowning), skin conductivity (high) and heart rate (high), which are read by sensors (vision-based recognizer, EMG, GSR, BVP) Ref.: Conati, C. 2002. Probabilistic assessment of user’s emotions in educational games. Applied Artificial Intelligence 16(7-8):555-575.

  38. Outline revisited: designing and evaluating life-like characters • The mind of life-like agents • Emotion, social role awareness, attitude change • Demo: Casino scenario • Implementation and character behavior scripting • Evaluating life-like characters • Using biosignals to detect user emotions • Experimental study with character-based quiz game • Book project: character scripting languages and applications

  39. Book Project: character scripting languages and applications • Wide dissemination of life-like character technology requires standardized ways to represent the behavior of agents • The book will offer the state of the art in XML-based markup languages and tools • Scripting languages for face animation, body animation and gestures, emotion expression, synthetic speech, interaction with the environment, … • Characters are already used in a wide variety of applications • The book contains some of the most successful character-based applications • Synopsis chapters on character design • H. Prendinger, M. Ishizuka (Eds.): Life-like Characters. Tools, Affective Functions and Applications. Springer, hardcover (in preparation) • Useful as a standard/reference book on the state of the art in life-like agents, and as a course book for HCI, HAI, multimedia, life-like agent applications, scripting languages, …

  40. Conclusion • Social Computing • Human-computer interaction as social interaction • Designing life-like characters as social actors • Believability-enhancing agent features • Emotion, personality, social role awareness, attitude change, familiarity change • Casino demo • Future avenues: “smart” environments (characters & annotated environments) • Evaluating life-like characters as social actors • Experimental study using users’ biosignals • Life-like characters’ affective response may undo some of the user’s negative feelings • Future avenues: real-time adaptivity of agent behavior to the user’s emotion, decision-theoretic approach to agent behavior
