Emotional Machines

Presented by

Chittha Ranjani Kalluri

Why Can’t…
  • We have a thinking computer?
  • A machine that performs about a million floating-point operations per second understand the meaning of shapes?
  • We build a machine that learns from experience rather than simply repeating everything that has been programmed into it?
  • A computer be similar to a person?

The above are some of the questions facing computer designers and others who are constantly striving to build more and more ‘intelligent’ machines.

So, what’s intelligence?

According to en.wikipedia.org:

“Intelligence is a general mental capability that involves the ability to reason, plan, solve problems, think abstractly, comprehend ideas and language, and learn.”

What does this mean for current machines?
  • It certainly doesn’t mean they aren’t intelligent!
  • Some amount of intelligence has to be built in
  • How can that be done?
  • Designers looked closely at how humans
    • Behave
    • Express themselves
    • Process information
    • Solve problems
Expressing ourselves
  • Body language
  • Facial expressions
  • Tone of voice
  • Words we choose
  • All of them vary based on situation
  • What we implicitly convey - emotion
What is emotion?

In psychology and common use, emotion is the language of a person's internal state of being, normally based in or tied to their internal (physical) and external (social) sensory feeling. Love, hate, courage, fear, joy, and sadness can all be described in both psychological and physiological terms.

Do machines need emotion?
  • Machines of today don’t need emotion
  • Machines of the future would need it to
    • Survive
    • Interact with other machines and humans
    • Learn
    • Adapt to circumstances
  • Emotions are a basis for humans to do all the above
What is an emotional machine?
  • An intelligent machine that can recognize emotions and respond using emotions
  • Concept proposed by Marvin Minsky in his 2006 book ‘The Emotion Machine’
  • Example: the WE-4RII (Waseda Eye No. 4 Refined II), being developed at Waseda University, Japan
The WE-4RII
  • Simulates six basic emotions
    • Happiness
    • Fear
    • Surprise
    • Sadness
    • Anger
    • Disgust
  • Recognizes certain smells
  • Detects certain types of touch
  • Uses 3 personal computers for communication
  • Still not as close to a true emotional machine as we would like
The WE-4RII

[Photographs: the WE-4RII’s facial displays for happiness, fear, surprise, sadness, anger, and disgust]
Maybe…
  • We’re not there…yet!
  • So how do we get from here to there?
Characteristics of multi-modal ELIZA
  • Based on message passing on a blackboard architecture
  • Input – user’s text string
  • Output – sentences and facial displays
  • Processing module consists of
    • NLP layer
    • Emotional recognition layer
      • Constructs facial displays
NLP Layer
  • String converted to list of words by parser
  • Spelling checked
  • Abbreviations replaced
  • Slang words and chat codes replaced with their standard forms
  • Some words replaced with synonyms by thesaurus
  • Input matched with predefined patterns by syntactic-semantic analyzer
  • Longest matching string used to generate reply
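The pipeline above can be sketched as follows. The lookup tables and patterns are hypothetical stand-ins for the actual spell checker, abbreviation list, thesaurus, and AIML pattern set; only the overall flow (normalize, then pick the longest-matching pattern to generate the reply) comes from the slides.

```python
import re

# Hypothetical normalization tables (the real layer uses a spell
# checker, abbreviation list, slang dictionary, and thesaurus).
ABBREVIATIONS = {"u": "you", "r": "are", "b4": "before"}
SYNONYMS = {"glad": "happy", "unhappy": "sad"}

# Toy predefined patterns; '*' is a wildcard, as in AIML.
PATTERNS = [
    ("I AM FEELING *", "Why are you feeling {0}?"),
    ("I AM *", "How long have you been {0}?"),
    ("*", "Please tell me more."),
]

def normalize(text):
    """Convert the input string to a list of canonical words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    words = [ABBREVIATIONS.get(w, w) for w in words]
    return [SYNONYMS.get(w, w) for w in words]

def reply(text):
    """Match normalized input against all patterns; the pattern with
    the most literal (non-wildcard) characters generates the reply."""
    sentence = " ".join(normalize(text)).upper()
    best = None
    for pattern, template in PATTERNS:
        regex = "^" + re.escape(pattern).replace(r"\*", "(.*)") + "$"
        m = re.match(regex, sentence)
        if m:
            literal = len(pattern.replace("*", ""))
            if best is None or literal > best[0]:
                best = (literal, template, m.groups())
    _, template, groups = best
    return template.format(*(g.strip().lower() for g in groups))
```

For example, `reply("I am feeling glad")` normalizes “glad” to “happy”, matches the longest pattern, and answers “Why are you feeling happy?”.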
NLP Layer
  • Repetition recognition ensures dialog does not enter loop
  • Rules written in AIML (Artificial Intelligence Markup Language)
  • Pragmatic analysis module checks reply against user preferences collected during conversation, and against goals and states of system
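Repetition recognition can be sketched as a small filter over recently produced replies; the window size and the fallback sentence below are assumptions, not details from the paper.

```python
from collections import deque

class RepetitionGuard:
    """Hypothetical repetition-recognition step: if the candidate reply
    was produced recently, substitute a fallback so the dialog does not
    enter a loop."""
    def __init__(self, window=3, fallbacks=None):
        self.recent = deque(maxlen=window)  # last few replies
        self.fallbacks = fallbacks or ["Let's talk about something else."]
        self.next_fallback = 0

    def filter(self, candidate):
        if candidate in self.recent:
            candidate = self.fallbacks[self.next_fallback % len(self.fallbacks)]
            self.next_fallback += 1
        self.recent.append(candidate)
        return candidate
```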
Emotion recognition layer
  • Emotive Lexicon Look-up Parser used to extract emotion eliciting factors
  • Bases it on a lexicon of words having emotional content
  • 247 words, each assigned a natural-number intensity
  • Overall emotional content of a string obtained from seven ‘thermometers’, which are updated whenever an emotionally rich word is found
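A minimal sketch of the thermometer update. The lexicon fragment and the seven thermometer labels below are assumptions (the paper's 247-word lexicon and exact label set are not reproduced here); only the mechanism of accumulating per-emotion intensities comes from the slides.

```python
# Hypothetical fragment of the emotive lexicon: each entry maps a word
# to an emotion and a natural-number intensity.
EMOTIVE_LEXICON = {
    "love":   ("happiness", 4),
    "great":  ("happiness", 2),
    "afraid": ("fear", 3),
    "hate":   ("anger", 4),
    "cry":    ("sadness", 3),
}

# Assumed labels for the seven thermometers.
THERMOMETERS = ["happiness", "sadness", "anger", "fear",
                "surprise", "disgust", "neutral"]

def scan(text):
    """Update the thermometers for every emotionally rich word found."""
    thermo = dict.fromkeys(THERMOMETERS, 0)
    for word in text.lower().split():
        if word in EMOTIVE_LEXICON:
            emotion, intensity = EMOTIVE_LEXICON[word]
            thermo[emotion] += intensity
    return thermo
```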
Emotion recognition layer
  • Emotive Labeled Memory Structure Extraction labels each pattern and its corresponding rules
  • Two additional AIML tags used – ‘affect’ and ‘concern’ – each taking the values positive, negative, joking, or normal
  • Goal-Based Emotion Reasoning stores user’s personal data
  • Two knowledge bases to determine affective state
    • Stimulus response to user’s input
    • Result of cognitive process of conversation to convey reply
Preference rules - examples
  • IF (user is happy) AND (user asks a question) AND (system’s reply is sad) AND (situation type of user is not negative) AND (highest thermometer is happy) THEN reaction is joy.
  • IF (user is sad) AND (system’s reply is sad) AND (situation type of user is joking) AND (situation type of the system is negative) AND (maximum affective thermometer is sad) THEN reply is resentment.
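The two rules above can be encoded as condition dictionaries checked against a snapshot of the conversation state. The state keys and value strings are a hypothetical encoding, not the paper's actual representation; keys ending in `_not` express the negated conditions.

```python
# Each preference rule pairs a set of IF-conditions with a reaction.
PREFERENCE_RULES = [
    ({"user_mood": "happy", "user_act": "question", "system_reply": "sad",
      "user_situation_not": "negative", "top_thermo": "happy"}, "joy"),
    ({"user_mood": "sad", "system_reply": "sad", "user_situation": "joking",
      "system_situation": "negative", "top_thermo": "sad"}, "resentment"),
]

def holds(conditions, state):
    """Check every condition of a rule against the conversation state."""
    for key, value in conditions.items():
        if key.endswith("_not"):
            if state.get(key[:-4]) == value:  # negated condition
                return False
        elif state.get(key) != value:
            return False
    return True

def react(state):
    """Return the reaction of the first rule whose conditions all hold."""
    for conditions, reaction in PREFERENCE_RULES:
        if holds(conditions, state):
            return reaction
    return "neutral"
```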
Facial display selection
  • Intensity of an emotion must exceed a threshold level before it can be expressed externally
  • If an emotion is active, system calculates values of all thermometers
  • Thermometer having highest value chosen as emotion
  • Intensity of emotion determines facial display
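The selection steps above amount to a threshold check followed by picking the highest thermometer. The threshold value and the intensity-to-display-strength scaling below are assumptions for illustration.

```python
THRESHOLD = 3  # assumed minimum intensity before an emotion is expressed

def select_display(thermometers):
    """Pick the thermometer with the highest value; express that emotion
    only if its intensity exceeds the threshold, with the intensity
    determining the strength of the facial display."""
    emotion = max(thermometers, key=thermometers.get)
    intensity = thermometers[emotion]
    if intensity <= THRESHOLD:
        return ("neutral", 0.0)
    # Assumed linear mapping from intensity to display strength.
    strength = min(1.0, intensity / 10.0)
    return (emotion, strength)
```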
Other work in this area
  • Emotionally Oriented Programming (EOP)
    • Allows programmers to explicitly represent and reason about emotions
    • Can build Emotional Machines (EMs) – intelligent software agents with explicit programming constructs for concepts like mood, feelings, temperament
    • Inspiration: thoughts and feelings are intertwined
      • Analysis of thought inspires feelings
      • Feelings inspire creation of thoughts
Other work in this area
  • Emotional Model for Intelligent Response (EMIR)
    • Developed by Mindsystems, an Australian company
    • Includes simulations for feelings such as boredom!
    • Methodology:
      • Looks at factors influencing a character
        • Success at achieving goals
        • Levels of a character’s control over situation
      • Compares this “state of mind” to a database of human responses mapped over time
    • Was in demo stage in 2002
Other work in this area
  • Emotionally Rich Man-machine Intelligent System (ERMIS)
    • Aims to develop a prototype human-computer interaction system that can interpret its user’s attitude or emotional state, e.g., activation/interest, boredom, and anger, from their speech and/or their facial gestures and expressions
    • Adopted techniques include linguistic speech analysis, robust speech recognition, and facial expression analysis
Other work in this area
  • Net Environment for Embodied, Emotional Conversational Agents (NECA)
    • Promotes concept of multi-modal communication with animated synthetic personalities
    • Key challenge – fruitfully combining different research strands, including situation-based generation of natural language and speech, and the modeling of emotions and personality
Conclusion

“The question is not whether intelligent machines can have emotions, but whether machines can be intelligent without any emotions.”

– Marvin Minsky, The Society of Mind

Bibliography
  • Emotional Machines – http://www.emotionalmachines.com
  • Emotional machines – Do we want them? – http://www.zdnet.com.au/news/communications/0,2000061791,20266134,00.htm
  • Marvin Minsky Home Page – http://web.media.mit.edu/~minsky/
  • Multi-Modal ELIZA – http://mmi.tudelft.nl/pub/siska/_TSD%20my_eliza.pdf
  • The WE-4RII – http://www.takanishi.mech.waseda.ac.jp/research/eyes/
  • Small Wonder – http://www.smallwonder.tv/
  • The HUMAINE Portal – http://emotion-research.net
  • ERMIS – http://manolito.image.ece.ntua.gr/ermis
  • NECA – http://www.oefai.at/NECA