Do software agents know what they talk about?

Agents and Ontology

dr. Patrick De Causmaecker, Nottingham, March 7-11 2005



Logic programming

  • First-order logic

  • Example: Prolog

  • Example: Rule-based systems

  • Example: Constraint Satisfaction



First-order logic

  • Predicates apply to atoms (individuals), not to other predicates.

  • Quantifiers range over individuals, not over predicates.

  • Grelling’s paradox (cannot be expressed in first-order logic):

  • If an adjective truly describes itself, call it “autological”, otherwise call it “heterological”. For example, “polysyllabic” and “English” are autological, while “monosyllabic” and “pulchritudinous” are heterological. Is “heterological” heterological? If it is, then it isn’t; if it isn’t, then it is.
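
The paradox cannot be stated in first-order logic because it needs a predicate applied to predicates. Informally, in second-order notation (where F(F) reads “the adjective F describes itself”):

Het(F) ⇔ ¬F(F)

Het(Het) ⇔ ¬Het(Het)   (take F = Het, and the contradiction follows)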



Example: Prolog

  • http://www.ugosweb.com/jiprolog/



% Family database: father(Parent, Child), mother(Parent, Child), male/1, female/1.
father(terach,abraham).
father(terach,nachor).
father(terach,haran).
father(abraham,isaac).
father(haran,lot).
father(haran,milcah).

mother(sarah,isaac).

male(terach).
male(abraham).
male(nachor).
male(haran).
male(isaac).
male(lot).

female(sarah).
female(milcah).
female(yiscah).

likes(X,pome).            % a fact with a variable: everything likes pome

% Derived relations.
son(X,Y) :- father(Y,X), male(X).
daughter(X,Z) :- father(Z,X), female(X).
grandfather(X,Z) :- father(X,Y), father(Y,Z).
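
Once this database is loaded into a Prolog system (for instance the JIProlog interpreter linked above), the derived relations can be queried directly; the answers below follow from the facts listed:

?- son(X, terach).
X = abraham ; X = nachor ; X = haran.

?- daughter(milcah, P).
P = haran.

?- grandfather(terach, G).
G = isaac ; G = lot ; G = milcah.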



Towers of Hanoi

% hanoi(N, From, To, Via, Moves): Moves is the list of [From,To] moves
% that transfers N discs from peg From to peg To, using peg Via.
hanoi(1, A, B, C, [[A,B]]) :- !.
hanoi(N, A, B, C, Moves) :-
    N1 is N - 1,
    hanoi(N1, A, C, B, Ms1),
    hanoi(N1, C, B, A, Ms2),
    append(Ms1, [[A,B]|Ms2], Moves),
    !.
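
A sample query (the move list follows from the clauses above):

?- hanoi(3, a, b, c, Moves).
Moves = [[a,b],[a,c],[b,c],[a,b],[c,a],[c,b],[a,b]].   % 2^3 - 1 = 7 moves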



Example: Rule-based systems

http://www.expertise2go.com/download/demo.html



RULE [Is the battery dead?]

If [the result of switching on the headlights] = "nothing happens" or

[the result of trying the starter] = "nothing happens"

Then [the recommended action] = "recharge or replace the battery"

RULE [Is the car out of gas?]

If [the gas tank] = "empty"

Then [the recommended action] = "refuel the car"



RULE [Is the battery weak?]

If [the result of trying the starter] : "the car cranks slowly" "the car cranks normally" and

[the headlights dim when trying the starter] = true and

[the amount you are willing to spend on repairs] > 24.99

Then [the recommended action] = "recharge or replace the battery"



RULE [Is the car flooded?]

If [the result of trying the starter] = "the car cranks normally" and

[a gas smell] = "present when trying the starter"

Then [the recommended action] = "wait 10 minutes, then restart flooded car"



RULE [Is the gas tank empty?]

If [the result of trying the starter] = "the car cranks normally" and

[a gas smell] = "not present when trying the starter"

Then [the gas tank] = "empty" @ 90



PROMPT [the result of trying the starter] Choice CF

"What happens when you turn the key to try to start the car?"

"the car cranks normally"

"the car cranks slowly"

"nothing happens"



PROMPT [a gas smell] MultChoice CF

"The smell of gasoline is:"

"present when trying the starter"

"not present when trying the starter"



PROMPT [the result of switching on the headlights] MultChoice CF

"The result of switching on the headlights is:"

"they light up"

"nothing happens"

PROMPT [the headlights dim when trying the starter] YesNo CF

"Do the headlights dim when you try the starter with the lights on?"

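The rules and prompts above are written in the e2gLite rule language of the linked demo. As a rough sketch of how the same backward chaining could be coded in plain Prolog (not the e2gLite engine; ask/2 and known/2 are invented helpers, and the certainty factors are dropped):

:- dynamic known/2.

% ask(Attribute, Value): ask the user for Attribute once, remember the answer,
% and succeed if it equals Value.
ask(Attr, Value) :- known(Attr, Answer), !, Answer == Value.
ask(Attr, Value) :-
    format("~w? ", [Attr]), read(Answer),
    assertz(known(Attr, Answer)),
    Answer == Value.

% Two of the rules above.
recommend('recharge or replace the battery') :-
    ask('the result of switching on the headlights', 'nothing happens').
recommend('recharge or replace the battery') :-
    ask('the result of trying the starter', 'nothing happens').
recommend('refuel the car') :-
    gas_tank(empty).
gas_tank(empty) :-
    ask('the result of trying the starter', 'the car cranks normally'),
    ask('a gas smell', 'not present when trying the starter').

A consultation is then simply the query ?- recommend(Action).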


Example: Constraint Satisfaction

http://kti.ms.mff.cuni.cz/~bartak/constraints/index.html
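
The link above points to Roman Barták's on-line guide to constraint programming. A minimal generate-and-test flavour of constraint satisfaction in plain Prolog (a sketch; dedicated constraint solvers propagate constraints instead of blindly enumerating):

% Colour three mutually adjacent regions so that no two get the same colour.
colour(C) :- member(C, [red, green, blue]).

solution([A,B,C]) :-
    colour(A), colour(B), colour(C),    % generate candidate assignments
    A \= B, B \= C, A \= C.             % test the constraints

% ?- solution(Regions).
% Regions = [red, green, blue] .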







Deductive reasoning

  • Intelligent behaviour can be achieved by giving the system a symbolic representation of its environment and allowing it to manipulate this representation syntactically.

  • The symbolic representation is a set of logical formulas. The manipulation is deduction, or theorem proving.



Interpretation

[Diagram: from raw perception (pixel manipulation) through interpretation to the knowledge base of beliefs, e.g. dist(me, d1) = 90 cm and door(d1), from which a plan (STOP) and an action (BRAKE!) are derived.]


Two problems

  • Transduction

    • Sufficiently fast transformation of observations into an adequate symbolic representation.

  • Representation/reasoning

    • Using the symbolic representation as the basis for the manipulation process. Both must be fast enough.



AI approach

  • Perception:

    • Vision, speech, natural language, learning, …

  • Representation:

    • Knowledge representation, automatic reasoning, automatic planning

  • A lot of work has been done, but the results are still very limited.



Agents as theorem provers

  • The internal state of the agent is a database of first-order predicates:

  • This database contains all the beliefs of the agent.

Open(valve221)

Temperature(reactor4726,321)

Pressure(tank776,28)
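
Written as Prolog facts (lower-cased, since capitalised names are variables in Prolog), such a belief database can be queried directly; a small sketch:

open(valve221).
temperature(reactor4726, 321).
pressure(tank776, 28).

% ?- temperature(reactor4726, T), T > 300.
% T = 321.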



Agents as theorem provers

  • Beliefs are neither exact nor complete.

  • Interpretation may be faulty.

  • Still, these predicates are all the agent has to go on.



Agents as theorem provers

  • Formally:

  • L = {all first-order predicates}

  • D = ℘(L) = {all L databases}

  • Δ, Δ1, Δ2, … ∈ D

  • ρ = {the deduction rules of the agent}

  • Δ ⊢ρ φ means that formula φ from L can be proven from database Δ using the rules ρ.



Agents as theorem provers

  • The agent:

    • The perception function:

      • see : S -> Per

    • The adaptation of the internal state:

      • next : D × Per -> D

    • The action function:

      • action : D -> Ac



  • Function: action by proof

  • function action(Δ : D) returns an action from Ac

  • begin

  • for each α ∈ Ac do

  •   if Δ ⊢ρ Do(α) then return α

  • end for

  • for each α ∈ Ac do

  •   if Δ ⊬ρ ¬Do(α) then return α

  • end for

  • return null

  • end
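
A rough Prolog rendering of the same selection loop (a sketch under assumed names: actions/1 enumerates Ac, while do/1 and forbidden/1 stand in for proving Do(α) and ¬Do(α) from the current database):

:- dynamic do/1, forbidden/1.

actions([suck, forward, turn]).

% First pass: an action that is explicitly prescribed.
select_action(A) :- actions(As), member(A, As), do(A), !.
% Second pass: an action that is at least not forbidden.
select_action(A) :- actions(As), member(A, As), \+ forbidden(A), !.
% Otherwise: no action.
select_action(null).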



Example: the vacuum cleaning agent



Vacuum cleaning

  • The world is described by perception predicates:

In(x,y)

Dirt(x,y)

Facing(d)

  • Perceived information changes over time, so the old perception facts must be removed:

old(Δ) = {P(t1,…,tn) | P ∈ {In, Dirt, Facing} and P(t1,…,tn) ∈ Δ}



Vacuum cleaning

  • The function new generates the new knowledge:

    • new : D × Per -> D (exercise)

  • One can then define next as (a Prolog sketch follows below):

    • next(Δ, p) = (Δ \ old(Δ)) ∪ new(Δ, p)



Vacuum cleaning

  • Deduction rules have the form

    • φ(…) → ψ(…)

  • “If φ is consistent with the content of the database, conclude ψ.”

  • Rule 1: work

    • In(x,y) ∧ Dirt(x,y) → Do(suck)

  • Rule 2: move (sketched as Prolog clauses below)

    • In(0,0) ∧ Facing(north) ∧ ¬Dirt(0,0) → Do(forward)

    • In(0,1) ∧ Facing(north) ∧ ¬Dirt(0,1) → Do(forward)

    • In(0,2) ∧ Facing(north) ∧ ¬Dirt(0,2) → Do(turn)

    • In(0,2) ∧ Facing(east) → Do(forward)
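
The Prolog sketch referred to above: the same rules as clauses over a belief database of in/2, dirt/2 and facing/1 facts (declared dynamic here so the fragment loads on its own):

:- dynamic in/2, dirt/2, facing/1.

do(suck)    :- in(X, Y), dirt(X, Y).
do(forward) :- in(0, 0), facing(north), \+ dirt(0, 0).
do(forward) :- in(0, 1), facing(north), \+ dirt(0, 1).
do(turn)    :- in(0, 2), facing(north), \+ dirt(0, 2).
do(forward) :- in(0, 2), facing(east).

% ?- assertz(in(0,0)), assertz(dirt(0,0)), do(Action).
% Action = suck.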



Conclusions

  • Rather impractical…

  • The agent must try to determine its optimal action by reasoning.

  • This takes time (deductive systems are slow).

  • Meanwhile, the world may have changed…

  • “Calculative rationality”: the agent decides on the action that was optimal at the time the reasoning process started.

  • Not always acceptable.



Other problems

  • Logic is elegant but slow.

  • The see function belongs to a difficult, poorly understood area of AI.

  • The vacuum cleaning problem was already difficult to describe!



Agent-oriented programming: Agent0 (Shoham, 1993)

  • Desire, belief, intention

  • In Agent0 an agent is specified by:

    • Capabilities

    • Initial beliefs

    • Initial commitments

    • Rules to derive commitments (commitment rules)



Agent0

  • A commitment rule consists of:

    • A message condition

      • To be matched against received messages

    • A mental condition

      • To be matched against the beliefs and intentions

    • An action

      • To be committed to if both conditions match



Agent0

  • Two kinds of actions:

    • Communicative

    • Private

  • Three kinds of messages:

    • Requests for action

    • Unrequests to cancel a requested action

    • Informs to pass information



COMMIT(

(agent, REQUEST, DO(time, action)) ;;; message condition

(B, [now, Friend agent] AND CAN(self, action)

AND NOT [time, CMT(self, anyaction)]), ;;; mental condition

self, DO(time, action)

)



The Agent0 interpreter loop

[Diagram: incoming messages and the initialisation step update the beliefs; commitment rules, beliefs and abilities update the commitments; executing the due commitments produces outgoing messages and internal actions.]

