A biologically motivated software architecture for an intelligent humanoid robot

A Biologically Motivated Software Architecture for an Intelligent Humanoid Robot

Richard Alan Peters II, D. Mitchell Wilkes,

Daniel M. Gaines, and Kazuhiko Kawamura

Center for Intelligent Systems

Vanderbilt University

Nashville, Tennessee, USA


Intelligence

The ability of an individual to learn from experience, to reason well, to remember important information, and to cope with the demands of daily living. (R. Sternberg, 1994).

  • Intelligence has emerged through evolution and is manifest in mammals. The processes of designing an intelligent robot might be facilitated by mimicking the natural structures and functions of mammalian brains.


Topics

  • Mammalian Brains

  • ISAC, the Vanderbilt Humanoid Robot

  • A Control System Architecture for ISAC

  • A Partial Implementation

  • Research Issues


The Structure of Mammalian Brains

Krubitzer, Kass, Allman, Squire

  • The evolution of structure

  • Common features of neocortical organization

  • Species differences

  • Memory and association

  • Attention

  • Implications for robot control architectures


The Evolution of Structure

Figure: Leah Krubitzer


Common Features of Neocortical Organization

  • Somatosensory Cortex (SI, SII)

  • Motor Cortex (M)

  • Visual Cortex (VI, VII)

  • Auditory Cortex (AI)

  • Association Cortex

  • Size differences in cortical modules are disproportionate to size differences in cortex


Common Features of Neocortical Organization

Figure: Leah Krubitzer


Species Differences

  • Sizes and shapes of a specific cortical field

  • Internal organization of a cortical field

  • Amount of cortex devoted to a particular sensory or cognitive function

  • Number of cortical fields

  • Addition of modules to cortical fields

  • Connections between cortical fields


Memory: a Functional Taxonomy

Squire

  • Immediate memory: data buffers for current sensory input; holds information for about 0.1s

  • Working memory: scratch-pads, e.g., the phonological loop and visuospatial sketch pad; the representation of sensory information in its absence

  • Short term memory (IM & WM) is a collection of memory systems that operate in parallel

  • Long-term memory: can be recalled for years; different physically from STM


Memory: Biological Mechanisms

  • Immediate memory — chemicals in synapse

  • Working memory — increase in presynaptic vesicles; intra-neuron and inter-neuron protein release and transmitter activity

  • Long-term memory — growth of new synapses; requires transcription of genes in neurons.


Association

  • The simultaneous activation of more than one sensory processing area for a given set of external stimuli

  • A memory that links multiple events or stimuli

  • Much of the neocortex not devoted to sensory processing appears to be involved in association


Memory and Sensory Data Bandwidth

  • Bandwidth of signals out of sensory cortical fields is much smaller than input bandwidth

  • Sensory cortical fields all project to areas within association cortex

  • Suggests: the environment is scanned for certain salient information and much is missed; previous memories linked by association fill in the gaps.


Attention: a Definition

  • An apparent sequence of spatio-temporal events, to which a computational system or subsystem allocates a hierarchy of resources.

  • In that sense, the dynamic activation of structures in the brain is attentional.


Attention: Some Types

  • Visual — where to look next

  • Auditory — sudden onset or end of sound

  • Haptic — bumped into something

  • Proprioceptive — entering unstable position

  • Memory — triggered by sensory input

  • Task — action selection

  • Conscious — recallable event sequence


Attention: Executive Control

Figure: Posner & Raichle


Mammalian Brains

  • Have sensory processing modules that work continually in parallel

  • Selectively filter incoming sensory data and supplement that information from memory through context and association

  • Exhibit dynamic patterns of activity through local changes in cellular metabolism — shifts in activation


ISAC, a Two-Armed Humanoid Robot


Physical Structure of ISAC

  • Arms: two 6-DOF arms actuated by pneumatic McKibben artificial muscles

  • Hands: anthropomorphic, pneumatic, with proximity sensors and 6-axis force/torque sensors at the wrists

  • Vision: stereo color PTV head

  • Audition: user microphone

  • Infrared motion sensor array


ISAC Hardware under Construction

  • Hybrid pneumatic / electric anthropomorphic hand

  • Head mounted binaural microphone system

  • Finger tip touch sensors


Computational Structure of ISAC

  • Network of standard PCs

  • Windows NT 4.0 OS

  • Special hardware limited to device controllers

  • Designed under Vanderbilt’s Intelligent Machine Architecture (IMA)


Low-Level Software Architecture: IMA

  • Software agent (SA) design model and tools

  • SA = one element of a domain-level system description

  • SA tightly encapsulates all aspects of an element

  • SAs communicate through message passing

  • Enables concurrent SA execution on separate machines on a network

  • Facilitates inter-agent communication with DCOM

  • Can implement almost any logical architecture
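The message-passing model above can be illustrated with a minimal Python sketch. The class and method names here are hypothetical; IMA itself is built on DCOM components distributed across networked PCs, which this single-process sketch does not reproduce.

```python
import queue

class SoftwareAgent:
    """Minimal stand-in for an IMA software agent: one element of a
    domain-level system description that communicates only by messages."""
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()   # messages arriving from other agents
        self.peers = {}              # name -> SoftwareAgent

    def connect(self, other):
        self.peers[other.name] = other

    def send(self, peer_name, message):
        # No shared state: all interaction goes through the peer's inbox.
        self.peers[peer_name].inbox.put((self.name, message))

    def receive(self):
        sender, message = self.inbox.get_nowait()
        return sender, message

# Two agents exchanging a message, as separate SAs would over the network.
camera = SoftwareAgent("camera")
tracker = SoftwareAgent("tracker")
camera.connect(tracker)
camera.send("tracker", {"blob": (120, 45)})
sender, msg = tracker.receive()
```

Because each agent owns its inbox, agents can run concurrently on separate machines with no shared memory, matching the encapsulation the slide describes.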


Primitive Agent Types

  • Hardware: provide abstractions of sensors and actuators, and low level processing and control (e.g., noise filtering or servo-control loops).

  • Behavior: encapsulate tightly coupled sensing-actuation loops. May or may not have runtime parameters.

  • Environment: process sensor data to update an abstraction of the environment. Can support behaviors such as “move-to” or “fixate” which require run-time parameters.

  • Task: encapsulate decision-making capabilities, and sequencing mechanisms for hardware, behavior, and environment agents.
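A behavior agent's sensing-actuation loop can be sketched as follows. The 1-D "joint" and the gain are illustrative stand-ins for real hardware agents; this is not IMA's actual interface.

```python
class BehaviorAgent:
    """A behavior agent: a tightly coupled sensing-actuation loop that
    servos a simulated 1-D joint toward a run-time target parameter."""
    def __init__(self, gain=0.5):
        self.position = 0.0          # stands in for a hardware agent's state
        self.gain = gain

    def sense(self):
        return self.position         # a hardware agent would filter real data

    def act(self, command):
        self.position += command     # a hardware agent would drive a servo

    def step(self, target):
        # Issue an incremental command, as IMA's incremental property requires.
        error = target - self.sense()
        self.act(self.gain * error)
        return abs(error)

agent = BehaviorAgent()
for _ in range(20):
    residual = agent.step(10.0)      # error shrinks geometrically
```

A task agent would sequence such behaviors by supplying the run-time target, while hardware agents below supply the real `sense`/`act` endpoints.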


Agent Object Model

  • The agent object model describes how an agent network, defined by the robot-environment model, is constructed from a collection of component objects


IMA Component Objects

  • Agent Comp. — agent interfaces to manager and to persistent streams

  • Policy Comp. — encapsulates an OS thread

  • Representation Comp. — a DCOM object that communicates an agent’s state to other agents

  • Mechanism Comp. — configurable objects that can be invoked to perform one of a set of computations

  • Agent Links — interfaces defined by representations

  • Relationship Comp. — manage a set of agent links to selectively update and / or use each participant link


Properties of IMA

  • Granularity — multiple logical levels

  • Composition — agents can incorporate agents

  • Reusable — can be combined for new functionalities

  • Inherently Parallel — asynchronous, concurrent operation

  • Explicit Representation — sensor info is ubiquitous

  • Flat Connectivity — all agents are logically equal w.r.t. sensory info access and actuator commands

  • Incremental — All modules that produce commands for the hardware work in an incremental mode



A Bio-Inspired Control Architecture

  • IMA can be used to implement almost any control architecture.

  • Individual agents can tightly couple sensing to actuation, and incorporate each other a la subsumption (Brooks).

  • IMA Inter-agent communications facilitate motor schemas (Arkin).

  • Composition of agents which have flat connectivity enables hybrid architectures


ISAC Control System Architecture

  • Primary Software Agents

  • Sensory EgoSphere

  • Attentional Networks

  • Database Associative Memory

  • Attentional Control via Activation

  • Learning

  • System Status Self Evaluation


Example Primary Software Agents

  • Visual attention

  • Color segmentation

  • Object recognition

  • Face recognition

  • Gesture recognition

  • Vision

  • Audition

  • Aural attention

  • Sound segmentation

  • Speech recognition

  • Speaker identification

  • Sonic localization


Example Primary Software Agents

  • L & R Arm control

  • L & R Hand control

  • PTV motion

  • Others

  • Motor

  • Infrared motion det.

  • Peer agents

  • Object agents

  • Attention agent

  • Sensory data records


Higher-Level Software Agents

  • Robot self agent

  • Human agent

  • Object agents (various)

  • Visually guided grasping

  • Converse

  • Reflex control

  • Memory association

  • Visual tracking

  • Visual servoing

  • Move EF to FP

  • Dual arm control

  • Person Id

  • Interpret V-Com

  • Reflex control


Agents and Cortical Fields

  • Agents can be designed to be functionally analogous to the common field structure of the neocortex.

  • Visual, auditory, haptic, proprioceptive, attentional, and memory-association agents remain on constantly and always transform the current sensory inputs from the environment


Atlantis: a Three-Layer Architecture

  • Deliberator — world model, planner

  • Sequencer — task queue, executor, monitors

  • Controller — sensor / actuator coupled behaviors

  • Erran Gat


Atlantis: General Schematic

  • Figure: Erran Gat


Three-Layer Control with IMA

  • Elements of control layer: agents.

  • Sequencing: through links depending on activation vectors. (Due to flat connectivity.)

  • Deliberation: Various agents modify the links and activation levels of others. (Due to composability.)
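The sequencing mechanism above can be sketched as activation-vector selection. The agent names and the helper are hypothetical; only the max-activation rule comes from the slides.

```python
def select_active(agents, activation):
    """Pick the sub-agent whose link has maximal activation, as in the
    virtual three-layer scheme: deliberative agents adjust the vector,
    and the link with the highest activation wins."""
    best = max(activation, key=activation.get)
    return agents[best]

# Illustrative behavior agents competing for the camera head.
agents = {"saccade": "SaccadeAgent",
          "pursuit": "SmoothPursuitAgent",
          "vor": "VORAgent"}
activation = {"saccade": 0.2, "pursuit": 0.7, "vor": 0.1}

first = select_active(agents, activation)    # pursuit wins initially
activation["saccade"] = 0.9                  # a deliberative agent re-weights
second = select_active(agents, activation)   # now saccade is selected
```

Sequencing then amounts to a deliberative agent rewriting the activation vector over time, with no direct calls into the controlled agents.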


Virtual Three-Layer Architecture

[Diagram: a deliberative agent adjusts activation links to sequencer agents S1…Sn within an IMA agent; the link with maximum activation selects the active agent.]


3-Layer Control through Schemas

  • Agents compete with each other for control of other agents and the use of hardware resources.

  • They do this by modifying activation vectors associated with agent relationships.

  • Thus, the sequencer is equivalent to a motor schema.


Simple Task Agent Operation: Handoff Task

1. Invoke Human Hand Env. Agent

2. Close Robot Gripper Res. Agent

3. Invoke Box Env. Agent

4. Open Robot Gripper

[Diagram: the task agent activates the Human Hand and Box environment agents (move-to), the Robot Gripper resource agent (open/close), and the Skin Color Tracking and Visual Servoing behavior agents.]


Current Implementation

  • ISAC is being designed primarily as a human service robot; it must interact smoothly and naturally with people.

  • Several high level agents mediate the interaction: robot self agent, human agent, object agents.


Two High-Level Agents

  • Human Agent: encapsulates what the robot knows about the human

  • Robot Self Agent: maintains knowledge about the internal state of the robot and can communicate this with the human


Human-Robot Interaction Desiderata

  • The robot is “user-friendly;” it has a humanoid appearance and can converse with a person.

  • A person can discover the robot’s abilities through conversation.

  • The person can evoke those abilities from the robot.

  • The robot can detect its own failures and report them to the person.

  • The person and the robot can converse about the robot’s internal state for failure diagnosis.


Human-Robot Interaction

[Diagram: the human converses with the Robot Self Agent through its Human Interaction component; the Robot Self Agent and Human Agent connect through IMA primitive agents, the DBAM, and a hardware interface to the robot and the rest of the software system.]


Robot Self Agent

  • Human Interaction: maps the person's goals into robot action

  • Action Selection: activates primitive agents based on information from the human interaction component

  • Robot Status Evaluation: detects failure in primitive agents and maintains information about the overall state of the robot


Human Agent

[Diagram: the Human Agent records the person's name, time of last interaction, and last phrase; the Self Agent's Emotion, Conversation, Activator, Interrogation, Pronoun, and Description modules link to the Human Hand environment agent, environment agents 1…M, and task agents 1…N.]


Human Interaction Agents

  • Conversation Module

    • Interprets the human's input and generates responses to the human

    • Based upon the Maas-Neotek robots developed by Mauldin

    • Increases or decreases the activation level of a primitive agent

  • Pronoun Module

    • Resolves human phrases to environment primitive agents

    • Acts as a pointer or reference for use by task primitive agents

    • Points to other modules of the Robot Self Agent or to primitive agents for purposes of failure description

  • Interrogation Module

    • Handles discussions about the robot's status involving questions from the human


Status Evaluation Agents

  • Emotion Module

    • An artificial emotional model used to describe the current state of the robot

    • Provides internal feedback to the robot's software agents

    • A fuzzy evaluator provides a textual description for each emotion

  • Description Module

    • Contains the names and a description of the purpose of primitive agents

    • Holds information about the status of primitive agents: active or inactive, successful or unsuccessful


Action Selection Agents

  • Activator Module is a “clearing-house” for contributing to the activation state of primitive agents

  • Other Robot Self Agent modules can adjust the activation level of primitive agents through the Activator Module


Details of Agents: IMA Component Level

[Diagram: component-level views of the Emotion, Conversation, Description, Activator, Pronoun, and Interrogation agents, showing their representations (Rep 1…N), relationships (Rel 1…N), links (Link 1…N), binding mechanism, fuzzy-text and interpreter mechanisms, configuration, and text I/O.]


Human Agent

[Diagram: inputs to the Human Agent's central sequencing agents: speech recognition and speaker identification from the person; pointing-finger detection, skintone detection, and face detection from the camera head; infrared motion detection from the IR sensor array.]


Testbed System

[Diagram: the Human Agent and Robot Self Agency connect to Text-To-Speech, human identification and detection, and the HandShake, Game, and HandOff task agents, which use voice ID, face detection, skin-tone tracking, hand, arm, speech recognition, pan/tilt head, and color image resources.]


Visual Attention: FeatureGate

  • A model of human visual attention, developed by Kyle R. Cave, that is consistent with empirical observation and experimental results.

  • Activates locations in the visual field based on their local salience and discriminability with respect to sets of features.

  • A pyramid structure in which information is gated to the next (smaller) level as a function of local activations
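The gating step can be sketched as follows. This is a simplified reading of the pyramid idea only, not Cave's full FeatureGate model (which also weights locations by discriminability against target features).

```python
def gate_level(act):
    """One FeatureGate-style gating step: each 2x2 block of the
    activation map passes only its most active value up to the next
    (smaller) pyramid level."""
    rows, cols = len(act), len(act[0])
    out = []
    for r in range(0, rows, 2):
        row = []
        for c in range(0, cols, 2):
            block = [act[r + i][c + j] for i in range(2) for j in range(2)]
            row.append(max(block))   # only the locally salient value survives
        out.append(row)
    return out

# A 4x4 activation map gated twice, down to the single attended value.
level0 = [[0.1, 0.3, 0.2, 0.0],
          [0.9, 0.2, 0.1, 0.4],
          [0.0, 0.1, 0.8, 0.2],
          [0.3, 0.2, 0.1, 0.1]]
level1 = gate_level(level0)   # 2x2 map of local winners
level2 = gate_level(level1)   # 1x1 map: the globally attended activation
```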


Decision Making with FeatureGate

  • FeatureGate must determine whether no target object, one target object, or multiple target objects are present.

  • The Bayes Test for Multiple Hypotheses is adapted to FeatureGate to decide which of the above hypotheses holds. The objective of this test is to determine which hypothesis has the minimum cost for a given result.

  • Probability density functions (pdfs), a priori probabilities and cost functions determine the total cost for each hypothesis.

  • Since a priori probabilities and cost functions are held constant, the pdfs are the only unknowns.


Information Used in Hypothesis Testing

  • The most efficient data for constructing pdfs for hypothesis testing is contained in levels 2, 3, and 4 (of levels 0-9).

  • At these three levels, the top ten activation values that are at least 65% of the maximum activation value are stored. The Euclidean distances between the units containing all of these stored activation values are computed.

  • The pdfs at each level for single-target and multiple-target presence are formulated from the Euclidean distances measured in test trials for each hypothesis.

  • The pdf for absence of target objects is a Gaussian distribution calculated with a user-defined deviation.
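The selection step described above can be sketched as follows. The map layout and helper names are assumptions; only the top-ten, 65%-of-maximum rule and the pairwise Euclidean distances come from the slides.

```python
import math

def salient_units(units, threshold=0.65, top_n=10):
    """Keep the top-N activation values that are at least `threshold`
    of the maximum, as described for pyramid levels 2-4.
    `units` maps (row, col) positions to activation values."""
    peak = max(units.values())
    kept = [(a, pos) for pos, a in units.items() if a >= threshold * peak]
    kept.sort(reverse=True)                 # strongest activations first
    return [pos for _, pos in kept[:top_n]]

def pairwise_distances(positions):
    """Euclidean distances between all pairs of the kept units."""
    return [math.dist(p, q)
            for i, p in enumerate(positions)
            for q in positions[i + 1:]]

# Four units on one level; the 0.1 activation falls below 65% of the peak.
units = {(0, 0): 1.0, (0, 3): 0.8, (4, 0): 0.7, (2, 2): 0.1}
pos = salient_units(units)
dists = pairwise_distances(pos)
```

The resulting distance sets, collected over test trials, are what the pdfs for the one-target and multiple-target hypotheses are built from.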


FeatureGate Test for Multiple Hypotheses

  • Choose Hj for which Cj is a minimum:

  • Hj is hypothesis j.

  • Cj is the total cost for hypothesis j.

  • Cj,i is the cost of choosing hypothesis j given hypothesis i is true.

  • pi,l(y) is the probability density function for hypothesis i at level l.

  • P(Hi) is the probability that hypothesis i occurs.
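The slide's equation image is not in the transcript, so the sketch below assumes the standard Bayes-risk form Cj = Σi Cj,i · P(Hi) · pi(y) built from the symbols defined above; the hypotheses, costs, priors, and pdfs are all illustrative.

```python
def bayes_cost(j, y, hypotheses, cost, prior, pdf):
    # Total cost of choosing hypothesis j given observation y:
    # C_j = sum_i C_{j,i} * P(H_i) * p_i(y)   (assumed standard form)
    return sum(cost[j][i] * prior[i] * pdf[i](y) for i in hypotheses)

def choose_hypothesis(y, hypotheses, cost, prior, pdf):
    # Choose the H_j for which C_j is a minimum.
    return min(hypotheses,
               key=lambda j: bayes_cost(j, y, hypotheses, cost, prior, pdf))

hypotheses = ["none", "one", "many"]
# Zero cost for a correct choice, unit cost for any error (illustrative).
cost = {j: {i: 0.0 if i == j else 1.0 for i in hypotheses} for j in hypotheses}
prior = {h: 1.0 / 3.0 for h in hypotheses}   # held constant, as stated
# Toy piecewise pdfs over a 1-D distance statistic y.
pdf = {
    "none": lambda y: 0.9 if y < 1.0 else 0.05,
    "one":  lambda y: 0.8 if 1.0 <= y < 3.0 else 0.1,
    "many": lambda y: 0.7 if y >= 3.0 else 0.1,
}
```

With 0-1 costs and equal priors, minimizing the cost reduces to picking the hypothesis whose pdf best explains the observed distances.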


Agent Coupling and Composition

  • Agents:

  • can subsume others, suppress or inhibit, or initiate, terminate, send messages,

  • can be sequenced deterministically or stochastically,

  • can interrupt each other, or

  • can possess activation vectors that can be modified by other agents.

  • Through agent relationships, weighted links can be forged or removed.


Camera Head Control

[Diagram: image sensory input feeds the visual attention network; retinal motion signals drive the eye motion center's saccade, smooth pursuit, vergence, VOR, and OKR behaviors, which issue pan and tilt motor commands to the camera head (eyes) controller.]


Features

Vergence

  • Fixating on same object

  • Cues for target segmentation

Saccade

  • Rapidly shifting gaze to a new target

Smooth Pursuit

  • Smooth tracking movement

  • Minimizing target slip

OKN

  • Stabilizing target on visual field

VOR

  • Keeping eyes on target while moving head


Methods

Vergence

  • Edge-based Zero Disparity Filter (Coombs & Brown)

  • Logical AND of left/right vertical-edge images

Saccade

  • Neural net-based saccadic map

  • Color cue providing error for online weight adjustment

Smooth Pursuit

  • World coordinates: predictive control (Bar-Shalom & Fortmann)

  • Image coordinates: RBF net tracking control (Looney)

OKN

VOR

  • Compensation for head velocity and acceleration

  • Compensation for target movement in the visual flow field

* Under investigation/design


How Do Humans Reach to Grasp?

  • First, humans fixate on the target

  • Fixation creates an object-centered frame

  • Finally, humans move the hand relative to the object-centered frame.

  • This approach is adopted in our Humanoid


FPS: Fixation Point Servoing

[Diagram: left and right camera views of the gripper, with fixation-point markers A1/A2, B1/B2, C1/C2, D1/D2 defining forward, down, left, and right offsets.]

  • The manipulation frame is fixated at the object

  • Servoing: IF A1A2 THEN (Down, Right, Back


Visual Servoing in IMA

[Diagram: left and right Schema Control agents, each with an agent policy (rule), engine, state representation, and activation relationship, drive evaluators/activators over resource components: left/right force sensors, arm positions, and grippers, plus object geometry.]


Grasping an Object

[Diagram: a Schema Control agent for the arm runs FPS with its agent policy (rule), engine, state representation, and activation relationship; evaluators/activators coordinate the Track Object behavior with the arm position, pan/tilt position, and gripper resource components via a Schema Control agent for the pan/tilt head.]


PneuHand

  • 3 Fingers

  • Opposable Thumb

  • Pneumatic

  • Low Cost

  • Applications:

    • Grasping Objects

    • Gestures


In Current Development

  • Sensory EgoSphere (SES)

  • Database Associative Memory (DBAM)

  • Learning Algorithms

  • System Status Evaluation


A Working Memory: Sensory EgoSphere

"a two-dimensional spherical surface that is a map of the world as seen by an observer at the center of the sphere. Visible points on regions or objects in the world are projected on the EgoSphere wherever the lines of sight from a sensor at the center of the EgoSphere to points in the world intersect the surface of the sphere.”

J. S. Albus


Sensory EgoSphere

  • A store for sensory data that can be localized (e.g., visual imagery, localized sound, motion vectors, distance to closest surface, etc.)

  • Data is time stamped.

  • SES is a multiresolution, 2Dx1D map of the robot's environment indexed by elevation, azimuth, and time.

  • Tessellated as a geodesic dome.
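The indexing scheme can be sketched with a toy SES. This version quantizes elevation and azimuth onto a coarse rectangular grid rather than a true geodesic-dome tessellation, and the class and method names are illustrative.

```python
import time

class SensoryEgoSphere:
    """Toy SES: time-stamped sensory records indexed by a quantized
    (elevation, azimuth) cell (a simplification of the geodesic dome)."""
    def __init__(self, bins=10):
        self.bins = bins
        self.store = {}   # (el_bin, az_bin) -> list of (timestamp, data)

    def _index(self, elevation_deg, azimuth_deg):
        # Elevation in [-90, 90], azimuth wrapped into [0, 360).
        el = int((elevation_deg + 90) / 180 * (self.bins - 1))
        az = int((azimuth_deg % 360) / 360 * self.bins) % self.bins
        return el, az

    def post(self, elevation_deg, azimuth_deg, data, t=None):
        key = self._index(elevation_deg, azimuth_deg)
        stamp = t if t is not None else time.time()   # data is time-stamped
        self.store.setdefault(key, []).append((stamp, data))

    def query(self, elevation_deg, azimuth_deg):
        return self.store.get(self._index(elevation_deg, azimuth_deg), [])

ses = SensoryEgoSphere()
ses.post(10, 45, "red blob", t=1.0)   # a localized visual event
nearby = ses.query(12, 44)            # nearby direction, same cell
```

Any agent can post localized data this way, and later queries by direction retrieve the time-stamped records for that part of the robot's surroundings.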


The SES as Working Memory

  • Sensory data can be copied to the SES by any agent that finds it important.

  • Simple descriptors (e.g. Fourier magnitude components of edge map, LPC coefficients) computed for each input can be stored. New input from the same location is compared to detect changes.

  • Associations with long term memory records can be formed.

  • Long term memories are consolidated by the repetition of such associations.
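The change-detection step can be sketched as a descriptor comparison. The vectors and threshold here are illustrative; the slides mention Fourier magnitude components and LPC coefficients as the actual descriptors.

```python
def changed(old_desc, new_desc, tol=0.1):
    """Compare a stored descriptor vector from an SES location with a
    new one computed for fresh input from the same location; a large
    Euclidean difference flags a change in the world."""
    diff = sum((a - b) ** 2 for a, b in zip(old_desc, new_desc)) ** 0.5
    return diff > tol
```

For example, `changed([0.2, 0.5, 0.1], [0.9, 0.4, 0.3])` flags a change, while a near-identical new descriptor does not, so only genuinely novel input needs further processing or posting.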


Database Associative Memory

  • The set of agents forms a database of records.

  • The links between records are associations.

  • Records of sensory data can be associated by space-time proximity, sequence, or co-occurrence during a specific task.

  • Includes all SES records.

  • Long-term memories are indicated by the strengths of links to associated agents and data records.


Learning

  • ISAC should modify programmed skills through experience and acquire new skills by example or trial and error.

  • Learning includes modifying the links between agents in the DBAM (modification of sequences in logical layer 2).

  • When stuck or in a learning mode, non-maximally activated agents could be initiated. The results are analyzed (by the robot or a teacher) and activations are altered accordingly through a reinforcement-learning protocol.

  • Could use, e.g., spreading activation.
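One way to read the reinforcement step is a minimal sketch like the following; this is not the system's actual learning rule, and the link names and learning rate are illustrative.

```python
def reinforce(weights, chosen, reward, lr=0.1):
    """One reinforcement-style update of DBAM link activations:
    the link exercised on this trial moves toward the received reward."""
    weights[chosen] += lr * (reward - weights[chosen])
    return weights

# Two competing task links; a teacher rewards 'grasp' over repeated trials.
links = {"grasp": 0.5, "wave": 0.5}
for _ in range(50):
    reinforce(links, "grasp", reward=1.0)
    reinforce(links, "wave", reward=0.0)
```

Over trials, the rewarded link's activation approaches 1 and the unrewarded link's decays toward 0, so max-activation sequencing comes to prefer the rewarded behavior.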


System Status Self-Evaluation

  • Most agents have a vector that indicates their current status and/or a measure of confidence in their most recent results.

  • Calling agents can use these to detect faults.

  • A System Status Evaluation Agent keeps track of current problems and biases agent activations if necessary.

  • Could be used by RL / SA network.

