
Multi-Agent System Communication Paradigms



  1. Multi-Agent System Communication Paradigms Abstract: Multi-Agent Systems seem like a good idea -- solve problems by using several different specialized agents who cooperate to reach a solution.  Aside from obvious problems (such as disagreements, inconsistencies, and composition), communication is a significant impediment.  Agents may be written by different people in different styles, with different objectives.  Simple communication protocols tend to be brittle and fail because not all agents may behave identically, and may respond in unanticipated ways.  Attempts to formalize and standardize communication, such as the one the Foundation for Intelligent Physical Agents (FIPA) developed using BDI (Belief, Desire, Intention) semantics, encountered difficulties that made implementation and conformance difficult.  Some of these problems stem from the FIPA standard's requirement that all agents model the beliefs, desires, and intentions of all other agents, which makes for a rather unnaturally omniscient society of agents.  One way to address this issue is to neglect the modeling of the "minds" of other agents, and model only the social commitments agents undertake (or fulfill) in the course of their conversations.  To achieve this, conversations are modeled as sequences of speech acts, where rules or policies (which may be considered social norms) translate from the speech acts to the instantiation or deletion of social commitments among the participant agents. Rob Kremer, University of Calgary, Department of Computer Science, Calgary, CANADA, kremer@cpsc.ucalgary.ca Commitment-based Conversations

  2. Simple Conversation: naive What time is it? It’s 2:00 CPSC 609.68/599.68: Agent Communications

  3. Simple Conversation: more complex What time is it? <nod> Just a sec, I’ll check <nod> It’s 2:00 <nod> Thanks <nod> CPSC 609.68/599.68: Agent Communications

  4. Simple Conversation: failure What time is it? <ignore> (didn’t hear) I don’t know, my watch stopped Go ask someone else. CPSC 609.68/599.68: Agent Communications

  5. State Machine Eg: “call for proposals” A MIP-net (Multi-agent Interaction Protocol net) – combining two A-nets (Agent nets) and one IP-net (Interaction Protocol net). Reference: Sea Ling & Seng Wai Loke. A Formal Compositional Model of Multiagent Interaction. AAMAS’03 – International Conference on Autonomous Agents and Multi-Agent System, July 14-18, 2003, Melbourne, Australia. ACM, 2003. Also available: http://delivery.acm.org/10.1145/870000/860791/p1052-ling.pdf?key1=860791&key2=2595358611&coll=&dl=ACM&CFID=15151515&CFTOKEN=6184618 Ling & Loke 2003, p.1053 CPSC 609.68/599.68: Agent Communications
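
The brittleness mentioned in the abstract can be made concrete with a toy protocol. The following is a minimal sketch in Python (not Ling & Loke's MIP-net formalism); the state names and message labels are assumptions of this example. Any reply the designer did not anticipate, such as "my watch stopped", has no transition, so the conversation simply breaks down.

# Rigid finite-state protocol for the "What time is it?" conversation.
PROTOCOL = {
    ("START", "query-time"): "WAIT_REPLY",
    ("WAIT_REPLY", "acknowledge-delay"): "WAIT_REPLY",  # "Just a sec, I'll check"
    ("WAIT_REPLY", "inform-time"): "DONE",
}

def run(messages):
    state = "START"
    for msg in messages:
        if (state, msg) not in PROTOCOL:
            # Unanticipated response: the rigid protocol has nowhere to go.
            return "FAILED in state %s on %r" % (state, msg)
        state = PROTOCOL[(state, msg)]
    return state

print(run(["query-time", "inform-time"]))                       # DONE
print(run(["query-time", "acknowledge-delay", "inform-time"]))  # DONE
print(run(["query-time", "inform-dont-know"]))                  # FAILED ...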

  6. BDI • Belief: environment • Desire: goals • Intention: the current desire(s) chosen by the selection function with the beliefs and desires as input CPSC 609.68/599.68: Agent Communications
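
As a rough illustration of the slide above, here is a minimal one-step deliberation sketch in Python, assuming a naive belief-revision step (set union with the latest percept) and a caller-supplied selection function; it is not any particular BDI architecture.

def deliberate(beliefs, desires, percept, select):
    beliefs = beliefs | percept                          # belief revision (naive union)
    options = {d for d in desires if d not in beliefs}   # desires not yet achieved
    intention = select(beliefs, options)                 # current desire chosen by the selection function
    return beliefs, intention

beliefs, intention = deliberate(
    beliefs={"at_home"},
    desires={"stay_dry", "get_to_work"},
    percept={"raining"},
    select=lambda b, opts: "stay_dry" if "raining" in b and "stay_dry" in opts
                           else (min(opts) if opts else None),
)
print(beliefs)    # {'at_home', 'raining'}
print(intention)  # stay_dry : the desire chosen given the current beliefs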

  7. Motivation • Agents aren’t objects – therefore they must communicate in order to influence other agents (as opposed to invoking methods on other agents) • We treat such communications just like other actions, and call it “speech act theory”1,2 1John Austin. How to Do Things With Words. Oxford University Press: Oxford, England, 1962 2John Searle, Speech acts: An Essay in the Philosophy of Language. Cambridge University Press: Cambridge, England, 1969. CPSC 609.68/599.68: Agent Communications

  8. Speech Acts • Performative verbs • Eg: request, inform, promise • Aspects: • Locutionary act: making an utterance • Illocutionary act: action performed in saying something • Perlocution: the effect of the act • Felicity conditions: • There must exist an accepted procedure, and the circumstances and persons must be specified • Must be executed correctly and completely • Must be sincere, and must have uptake CPSC 609.68/599.68: Agent Communications

  9. Example Request • Normal I/O conditions: • Hearer not deaf • Not uttered in a play or film • Preparatory conditions: • Speaker must correctly choose the act • Hearer must be able • Speaker must believe the hearer is able • Hearer wouldn’t have done the act anyway • Sincerity conditions: • Not sarcasm, etc. CPSC 609.68/599.68: Agent Communications

  10. Types of Speech Acts • Representatives: commit the speaker to the truth of a proposition • Eg: Informing • Directives: attempt to get the hearer to do something • Eg: Requesting • Commissives: commit the speaker to a course of action • Eg: Promising • Expressives: express some psychological state • Eg: Thanking • Declarations: effect some change in an institutional state of affairs • Eg: Declaring war CPSC 609.68/599.68: Agent Communications

  11. Rational Action • Speech act theory can be considered as a specialization of a more general theory of rational action1 • Eg: “A request is an attempt on the part of spkr, by doing e, to bring about a state where, ideally, (i) addr intends α (relative to the spkr still having that goal, and addr still being helpfully inclined to spkr), and (ii) addr actually eventually does α, or at least brings about a state where addr believes it is mutually believed that it wants the ideal situation” 1,p.241 1P. R. Cohen and H. J. Levesque. Rational interaction as the basis for communication. In P. R. Cohen, J. Morgan, and M. E. Pollack (eds), Intentions in Communication, pp 221-56. The MIT Press: Cambridge, MA, 1990. CPSC 609.68/599.68: Agent Communications

  12. FIPA Messages CPSC 609.68/599.68: Agent Communications See FIPA standard SC00061G

  13. FIPA Performatives (informally) See FIPA standard SC00037J CPSC 609.68/599.68: Agent Communications

  14. FIPA Semantics <i, inform(j, φ)> FP: Biφ /\ ¬Bi(Bifjφ \/ Uifjφ) RE: Bjφ. Here i is the agent, φ is the proposition (the semantic content), FP is the feasibility precondition, and RE the rational effect. Bifjφ ≡ Bjφ \/ Bj¬φ and Uifjφ ≡ Ujφ \/ Uj¬φ (where Uj means “j is uncertain about φ”); the rational effect Bjφ reads “j believes φ”. CPSC 609.68/599.68: Agent Communications

  15. FIPA inform semantics • Formally: <i, inform(j, φ)> FP: Biφ /\ ¬Bi(Bifjφ \/ Uifjφ) RE: Bjφ • Example: (inform :sender (agent-identifier :name i) :receiver (set (agent-identifier :name j)) :content "weather (today, raining)" :language Prolog) CPSC 609.68/599.68: Agent Communications
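
As a hedged illustration of how the inform feasibility precondition might be checked in code, here is a Python sketch that represents beliefs and uncertainties as flat sets of proposition strings; that representation (and the "not " prefix for negation) is an assumption of this example, not FIPA SL.

def fp_inform(phi, i_believes, i_beliefs_about_j, i_uncertainty_about_j):
    """FP: Bi(phi)  /\\  not Bi( Bifj(phi) \\/ Uifj(phi) )"""
    bi_phi = phi in i_believes
    # Bifj(phi): i believes j already has a settled belief about phi (phi or not-phi)
    bifj = phi in i_beliefs_about_j or ("not " + phi) in i_beliefs_about_j
    # Uifj(phi): i believes j is uncertain about phi either way
    uifj = phi in i_uncertainty_about_j or ("not " + phi) in i_uncertainty_about_j
    return bi_phi and not (bifj or uifj)

phi = "weather(today, raining)"
print(fp_inform(phi, i_believes={phi},
                i_beliefs_about_j=set(), i_uncertainty_about_j=set()))  # True: ok to inform
print(fp_inform(phi, i_believes={phi},
                i_beliefs_about_j={phi}, i_uncertainty_about_j=set()))  # False: j already believes it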

  16. FIPA request semantics • Formally: <i, request(j, a)> FP: FP(a)[i\j] /\ Bi Agent(j, a) /\ ¬Bi Ij Done(a) RE: Done(a), where FP(a)[i\j] denotes the part of the FPs of a which are mental attitudes of i. • Example: (request :sender (agent-identifier :name i) :receiver (set (agent-identifier :name j)) :content "open \"db.txt\" for input" :language vb) CPSC 609.68/599.68: Agent Communications
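
In the same spirit, a sketch of the request feasibility precondition; FP(a)[i\j] is approximated by a caller-supplied boolean, and the flat-set representation of i's beliefs is again an assumption of the example rather than FIPA SL.

def fp_request(a, fp_a_mental_part_of_i, i_believes_j_is_agent_of, i_believes_j_already_intends):
    """FP: FP(a)[i\\j]  /\\  Bi Agent(j, a)  /\\  not Bi Ij Done(a)"""
    return (fp_a_mental_part_of_i
            and a in i_believes_j_is_agent_of            # Bi Agent(j, a)
            and a not in i_believes_j_already_intends)   # not Bi Ij Done(a)

a = 'open "db.txt" for input'
print(fp_request(a, True, i_believes_j_is_agent_of={a},
                 i_believes_j_already_intends=set()))  # True: sensible to request
print(fp_request(a, True, i_believes_j_is_agent_of={a},
                 i_believes_j_already_intends={a}))    # False: j already intends to do it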

  17. Plan CPSC 609.68/599.68: Agent Communications

  18. BDI Agent CPSC 609.68/599.68: Agent Communications

  19. Basics • First-order modal logic with identity • Three primitive attitudes: Bip: i (implicitly) believes p; Uip: i is uncertain about p but thinks that p is more likely than ¬p; Cip: (choice) i desires that p currently holds • Other attitudes: PGip: i has p as a persistent goal (desire?); Iip: i has the intention to bring about p CPSC 609.68/599.68: Agent Communications

  20. Actions • a1 ; a2: sequence in which a2 follows a1. • a1 | a2: nondeterministic choice in which either a1 happens or a2 happens, but not both. • Feasible(a, p): a can take place and, if it does, p will be true just after that. (Feasible(a) ≡ Feasible(a, True)) • Possible(p) ≡ ∃a.Feasible(a, p). • Done(a, p): a has just taken place and p was true just before that. (Done(a) ≡ Done(a, True)) • Agent(i, a): i is the only agent that ever performs (in the past, present or future) the action a. • Single(a): a is not a sequence. ¬Single(a1;a2), but Single(a1|a2) iff Single(a1) /\ Single(a2). CPSC 609.68/599.68: Agent Communications
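
The Single property can be computed mechanically over action expressions. Below is a small Python sketch that encodes a1;a2 as ("seq", a1, a2) and a1|a2 as ("alt", a1, a2); the encoding is an assumption of this example, not FIPA SL syntax.

def single(a):
    """Single(a): a is not a sequence; a1|a2 is Single iff both alternatives are."""
    if isinstance(a, str):                 # an atomic act
        return True
    kind, a1, a2 = a
    if kind == "seq":                      # not Single(a1;a2)
        return False
    if kind == "alt":                      # Single(a1|a2) iff Single(a1) /\ Single(a2)
        return single(a1) and single(a2)
    raise ValueError("unknown action constructor: %r" % kind)

print(single("drop_vase"))                                    # True
print(single(("alt", "drop_vase", "smash_with_hammer")))      # True
print(single(("seq", "pick_up_vase", "drop_vase")))           # False
print(single(("alt", "drop_vase", ("seq", "aim", "throw"))))  # False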

  21. Abbreviations • Bifip ≡ Bip \/ Bi¬p • Brefi(ιx δ(x)) ≡ ∃y.Bi(ιx δ(x) = y). Agent i believes that it knows the (x which is) δ. • Uifip ≡ Uip \/ Ui¬p • Urefi(ιx δ(x)) ≡ ∃y.Ui(ιx δ(x) = y). • ABn,i,jp ≡ BiBjBi…p, where n is the number of B operators alternating between i and j. CPSC 609.68/599.68: Agent Communications

  22. Property: intending to achieve a RE (∃x . Bi(ak = x)) // there exists an action /\ RE(ak) = p // whose RE is p /\ ¬Ci¬Possible(Done(ak)) // that i thinks should be done → (Iip → IiDone(a1 | … | an)) // then if i intends p, i intends one of the acts that can achieve it. Where a1, ..., an are all the acts of type ak. “If I intend to break the vase, then I intend to either drop it or smash it with a hammer” CPSC 609.68/599.68: Agent Communications

  23. Property: satisfiability of intent ⊨ IiDone(a) → BiFeasible(a) \/ IiBiFeasible(a) If agent i intends a, then it needs to believe a is feasible, or at least have the intent to discover whether a is feasible. “If I intend to build a perpetual motion machine, then I have to believe it’s possible, or at least intend to discover whether it’s possible.” CPSC 609.68/599.68: Agent Communications

  24. Property: intent of act implies intent of RE ⊨ IiDone(a) → IiRE(a) If agent i intends a, then it also intends the RE of a. “If I intend to drop the vase, then I also intend to break the vase” CPSC 609.68/599.68: Agent Communications

  25. Property: observing an act ⊨ Bi(Done(a) /\ Agent(j, a) → IjRE(a)) If agent i observes j doing a, then i will come to believe that j intends the RE of a. “If I see Jane hammering the vase, then I believe that Jane intends to break the vase” CPSC 609.68/599.68: Agent Communications

  26. BDI Issues • Sometimes agents end up with expressions like “I believe that you believe that I believe that you believe that…” • Calls on agents to have an omniscient view of all the other agents • FIPA is based on BDI semantics CPSC 609.68/599.68: Agent Communications

  27. Social Commitments Many utterances imply some sort of “conversational” social commitment • Eg: a request commits the receiver to reply Other social commitments are negotiated • Eg: “wash my car” Basic Agent body: • When an agent observes (or sends or receives) a message, it uses policies (rules) as social norms that generate (or delete) social commitments • Agents spend their free time trying to fulfill the social commitments for which they are debtors CPSC 609.68/599.68: Agent Communications
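
A minimal Python sketch of this basic agent body: each observed message is passed through policies (social norms) that add or delete social commitments, stored as (debtor, creditor, action) triples as on the slides that follow. The two policies shown, and the reading that a reply discharges the sender's own commitment to reply, are assumptions of the sketch.

commitments = set()   # each social commitment is a (debtor, creditor, action) triple

def p_request(msg):
    # a request commits the receiver to reply to the sender
    if msg["performative"] == "request":
        return [("add", (msg["receiver"], msg["sender"], "reply"))]
    return []

def p_reply(msg):
    # a reply discharges the sender's commitment to reply to the receiver
    if msg["performative"] == "reply":
        return [("delete", (msg["sender"], msg["receiver"], "reply"))]
    return []

def observe(msg, policies=(p_request, p_reply)):
    for policy in policies:
        for op, commitment in policy(msg):
            if op == "add":
                commitments.add(commitment)
            else:
                commitments.discard(commitment)

observe({"performative": "request", "sender": "Alice", "receiver": "Bob"})
print(commitments)   # {('Bob', 'Alice', 'reply')} : Bob is now a debtor to Alice
observe({"performative": "reply", "sender": "Bob", "receiver": "Alice"})
print(commitments)   # set() : the commitment has been fulfilled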

  28. FIPA Performatives: inform, proxy, propagate, request, cancel, confirm, disconfirm, propose, inform-ref, query-if, request-when, agree, accept-proposal, query-ref, call-for-participation, request-whenever, subscribe, failure, not-understood, reject-proposal, refuse. Commitment-based Conversations

  29. FIPA Performatives, with added categories: performative, ack, reply, reply-propose-discharge, affirmative-reply, and negative-reply, alongside the FIPA performatives above. Commitment-based Conversations

  30. FIPA Performatives: the performatives and added categories, now arranged in a lattice (lattice diagram). Commitment-based Conversations

  31. FIPA Performatives: the lattice extended with propose-discharge, nack, done, notify, and timeout. Commitment-based Conversations
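
One way such a lattice might be represented is as a parent map with a subsumption test, so that a single policy can match a whole subtree of performatives. The edges below are a small assumed fragment for illustration only; they are not the full lattice from these slides.

PARENTS = {
    "inform": {"performative"},
    "ack": {"inform"},
    "reply": {"inform"},
    "affirmative-reply": {"reply"},
    "agree": {"affirmative-reply"},
    "confirm": {"affirmative-reply"},
    "negative-reply": {"reply"},
    "refuse": {"negative-reply"},
    "request": {"performative"},
}

def isa(p, ancestor):
    """True if performative p is subsumed by ancestor in the lattice."""
    if p == ancestor:
        return True
    return any(isa(parent, ancestor) for parent in PARENTS.get(p, ()))

# A policy keyed on "reply" then fires for any kind of reply: agree, refuse, ...
print(isa("agree", "reply"))    # True
print(isa("refuse", "reply"))   # True
print(isa("request", "reply"))  # False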

  32. Policies Commitment-based Conversations

  33. Policies → Commitment Operators • P-inform: on inform, add commitment (receiver, sender, ack) • P-ack: on ack, delete commitment (sender, receiver, ack) • P-request: on request, add commitment (receiver, sender, reply) • P-reply: on reply, delete commitment (receiver, sender, reply) • P-agree: on agree, add commitment (receiver, sender, content) • P-confirm: on confirm, delete commitment (receiver, sender, content) (Diagram: Performatives → Social Commitments, relating the performatives inform, ack, reply, request, agree, and confirm to the social commitments ack, reply, communication-act, action, and unspecified action.) Commitment-based Conversations
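
The policy table above could be encoded directly as data, keyed by the triggering performative. Which message roles fill the debtor and creditor slots is this sketch's reading of the table (copied as listed), so treat the role assignments as an assumption.

# (operator, debtor role, creditor role, action) per triggering performative
POLICIES = {
    "inform":  ("add",    "receiver", "sender",   "ack"),      # P-inform
    "ack":     ("delete", "sender",   "receiver", "ack"),      # P-ack
    "request": ("add",    "receiver", "sender",   "reply"),    # P-request
    "reply":   ("delete", "receiver", "sender",   "reply"),    # P-reply, roles as listed on the slide
    "agree":   ("add",    "receiver", "sender",   "content"),  # P-agree
    "confirm": ("delete", "receiver", "sender",   "content"),  # P-confirm
}

def apply_policy(msg, commitments):
    entry = POLICIES.get(msg["performative"])
    if entry is None:
        return commitments
    op, debtor_role, creditor_role, action = entry
    commitment = (msg[debtor_role], msg[creditor_role], action)
    return commitments | {commitment} if op == "add" else commitments - {commitment}

store = set()
store = apply_policy({"performative": "inform", "sender": "Alice", "receiver": "Bob"}, store)
print(store)   # {('Bob', 'Alice', 'ack')} : Bob owes Alice an acknowledgement
store = apply_policy({"performative": "ack", "sender": "Bob", "receiver": "Alice"}, store)
print(store)   # set() : the ack commitment has been discharged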

  34. Example: Informally (Alice and Bob) • Alice: “Can you attend this meeting?” (performative: request, content: attend(Bob,x)) • Bob: “Sure...” (performative: agree, content: request|attend(Bob,x)) • Alice: (nod) (performative: ack, content: agree|request|attend(Bob,x)) • Bob: “I’m here” (performative: propose, content: discharge|attend(Bob,x)) • Alice: (nod) (performative: ack, content: discharge|attend(Bob,x)) • Alice: “Thanks for coming.” (performative: accept-proposal, content: discharge|attend(Bob,x)) • Bob: (nod) (performative: ack, content: accept-proposal|discharge|attend(Bob,x)) Commitment-based Conversations

  35. Example: Perf. Lattice and Commitments (sequence diagram between Alice and Bob). Each message of the conversation is shown with its ancestors in the performative lattice (inform, request, reply, agree, propose/discharge, accept-proposal, ack) and the social commitments it adds or discharges: ack(Bob,Alice,x), reply(Bob,Alice,x), ack(Alice,Bob,x), act(Bob,Alice,x), reply-propose-discharge(Alice,Bob,x), propose-discharge(Bob,Alice,x). Commitment-based Conversations

  36. Example: Implementation Details (sequence diagram between Alice and Bob). The same conversation, now including implementation-level commitments in addition to the conversational ones: consider(Alice,Bob,x), decide(Bob,Alice,x), evaluate(Alice,Bob,x), and accept(Bob,Alice,x), with policies matched against performative/content patterns such as request/*, agree/request|*, propose/discharge|*, and accept-proposal/propose/discharge|*. Commitment-based Conversations

  37. Conclusions • Arranging performatives in a lattice simplifies interpretation • Messages (performatives) → policies → commitment operators → shared social commitments • Easily observable by 3rd parties • Agents do not have to be implemented in the SC style (eg. could be BDI internally) • Turn taking arises naturally Commitment-based Conversations
