
A Cognitive Framework for Delegation to an Assistive User Agent

Presentation Transcript


  1. A Cognitive Framework for Delegation to an Assistive User Agent Karen Myers and Neil Yorke-Smith Artificial Intelligence Center, SRI International

  2. Overview • CALO: a learning cognitive assistant • User delegation of tasks to CALO • Delegative BDI agent framework • Goal adoption and commitments • Summary and research issues

  3. CALO: Cognitive Assistant that Learns and Organizes • CALO supports a high-level knowledge worker • Understands the “office world”, your projects and schedule • Performs delegated tasks on your behalf • Works with you to complete tasks • Stays with you (and learns) over long periods of time • Learns to anticipate and fulfill your needs • Learns your preferred way of working [Diagram callouts: Track execution of project tasks • Help manage time and commitments • Perform tasks in collaboration with the user]

  4. CALO Year 2

  5. Overview • CALO: a learning cognitive assistant • User delegation of tasks to CALO • Delegative BDI agent framework • Goal adoption and commitments • Summary and research issues

  6. Delegation May Lead to Conflicts • Focus on delegation of tasks from user to CALO • Not on tasks to be performed in collaboration • One aspect of CALO’s role as intelligent assistant • CALO cannot act if conflicts over actions • Conflicts in tasks • “purchase this computer on my behalf” • “register me for the Fall Symposium” • Conflicts in guidance • “always ask for permissions by email” • “never use email for sensitive purchases”

  7. Conflicts in User’s Desires • “I wish to be thin” • “I wish to eat chocolate” • But Richard Waldinger’s scotch mocha brownies are full of calories • ⇒ conflict between incompatible desires • User’s desires conflict with each other • Humans seem to have no problem with such conflicts • CALO must recognize and respond appropriately

  8. Other Types of Conflicts • Current and new commitments • Currently CALO is undertaking tasks to: • Purchase an item of computer equipment • Register user for a conference • Now user tasks CALO to register for a second conference • Set of new goals is logically consistent and coherent • But infeasible because insufficient discretionary funds • Commitments and advice • User tasks CALO to schedule visitor’s seminar in best conference room • Existing advice: “Never change a booking in the auditorium without consulting me” • New goal and existing advice are inconsistent

  9. The BDI Framework • CALO’s ability to act is based on BDI framework • Beliefs = informational attitudes about the world • Desires = motivational attitudes on what to do • Intentions = deliberative commitments to act • Realized in the SPARK agent system • Hierarchical, procedural reasoning framework • BDI components in SPARK represented as: • Facts (beliefs) • Intentions (goals/intentions) • Desires are not represented • Procedures are plans to achieve intentions
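As a concrete illustration of how these components might be held in a procedural BDI reasoner, here is a minimal Python sketch; the class and field names are invented for this illustration and are not SPARK's actual API.

    from dataclasses import dataclass, field

    @dataclass
    class Procedure:
        """A plan: a recipe the agent can execute to achieve a matching intention."""
        name: str
        achieves: str                             # goal/intention pattern this procedure can satisfy
        steps: list = field(default_factory=list)

    @dataclass
    class AgentState:
        """BDI components as represented in a SPARK-like procedural reasoner."""
        facts: set = field(default_factory=set)         # beliefs, e.g. ("discretionary_funds", 1200)
        intentions: list = field(default_factory=list)  # goals/intentions currently being pursued
        procedures: list = field(default_factory=list)  # plan library
        # Note: desires have no explicit representation here, as the slide points out.

    state = AgentState()
    state.facts.add(("discretionary_funds", 1200))
    state.procedures.append(Procedure("register_for_conference", achieves="attend_conference"))
    state.intentions.append("attend_conference(AAAI)")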

  10. Desires vs. Goals • Both are motivational attitudes • Desires may be neither coherent (with beliefs) nor consistent (with each other) • Goals must be both • Desires are ‘wishes’; goals are ‘wants’ • “I wish to be thin and I wish to eat chocolate” • “I want to have another of Richard’s brownies” • Desires lead to goals • CALO’s primary desire: satisfy its user • Secondary desires→goals to do what user asks

  11. ‘BDI’ Agents are Really ‘BGI’ • Decision theory emphasizes B and D • AI agent theory emphasizes B and I • In most BDI literature, ‘Desires’ and ‘Goals’ are confounded • In practice, focus is on: • goal and then intention selection • option generation, and plan execution and scheduling • Focus has been much less on (vital for CALO): • deliberating over desires • goal generation • advisability

  12. The Problem with BGI • When Desires and Goals are unified into a single motivational attitude: • Can’t support conflicting D/G (and D/B) • Hard to express goal generation • Hard to diagnose and resolve conflicts • Between D/G and I, and between G, I, and plans • Hard to handle conflicts in advice • How can CALO make sense of the user’s taskings in order to act upon them? • How can CALO recognize and respond to (potential) conflicts?

  13. Overview • CALO: a learning cognitive assistant • User delegation of tasks to CALO • Delegative BDI agent framework • Goal adoption and commitments • Summary and research issues

  14. Cognitive Models for Delegation [Diagram: user and agent mental states side by side. The user’s beliefs Buser and the agent’s beliefs Bagent are kept in alignment; the agent’s desires Dagent (satisfy all tasks; do assigned tasks) derive from the user’s desires Duser; the user’s goals Guser, through the user’s decision-making and delegation, become the agent’s candidate goals GCagent, which refinement and goal adoption turn into adopted goals GA.]

  15. Delegative BDI Agent Architecture [Diagram: the user supplies goals (G) and two kinds of advice, Goal Advice (AG) and Execution Advice (AE). Within the agent, beliefs B and desires D feed Candidate Goals (GC); goal adoption, constrained by AG, promotes candidate goals to Adopted Goals (GA); adopted goals become Intentions (I), which execute under AE and may sub-goal; failures and conflicts trigger revision back through the goal stages.]
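One way to picture the architecture in code is as a record holding the sets named in the diagram; a hypothetical sketch, not the authors' formalism:

    from dataclasses import dataclass, field

    @dataclass
    class DelegativeState:
        """Mental state of a delegative BDI agent, S = (B, D, GC, GA, I), plus user advice."""
        beliefs: set = field(default_factory=set)           # B
        desires: set = field(default_factory=set)           # D: here simply "satisfy the user"
        candidate_goals: set = field(default_factory=set)   # GC: delegated taskings not yet committed to
        adopted_goals: set = field(default_factory=set)     # GA: goals the agent is committed to achieving
        intentions: set = field(default_factory=set)        # I: plans being executed for adopted goals
        goal_advice: list = field(default_factory=list)     # AG: constrains the GC -> GA adoption step
        exec_advice: list = field(default_factory=list)     # AE: constrains plan selection and execution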

  16. Overview • CALO: a learning cognitive assistant • User delegation of tasks to CALO • Delegative BDI agent framework • Goal adoption and commitments • Summary and research issues

  17. Requirements on Goal Adoption • Self-consistency: GA must be mutually consistent • Coherence: GA must be mutually consistent relative to the current beliefs B • Feasibility: GA must be mutually satisfiable relative to current intentions I and available plans • Includes resource feasibility • Reasonableness: GA should be mutually ‘reasonable’ with respect to current B and I • Common sense check: did you really mean to purchase a second laptop computer today?
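Those four requirements can be read as a gate that each candidate goal must pass before adoption. A simplified sketch, in which consistent, coherent, feasible, and reasonable stand for checks the agent designer supplies:

    def can_adopt(candidate, adopted_goals, beliefs, intentions, plans,
                  consistent, coherent, feasible, reasonable):
        """Return True if adopting `candidate` keeps the adopted-goal set well behaved."""
        proposed = adopted_goals | {candidate}
        return (consistent(proposed)                           # self-consistency of GA
                and coherent(proposed, beliefs)                # coherence with current beliefs B
                and feasible(proposed, intentions, plans)      # satisfiable given I, plans, and resources
                and reasonable(proposed, beliefs, intentions)) # common-sense check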

  18. Responding to Conflicting Desires • Goal adoption process should admit: • Adopting, suspending, or rejecting candidate goals • Modifying adopted goals and/or intentions • Modifying beliefs (by acting to change world state) • Example: User desires to attend a conference in Europe but lacks sufficient discretionary funds • shorten a previously scheduled trip • cancel the planned purchase of a new laptop • or apply for a travel grant from the department

  19. Combined Commitment Deliberation • Goal adoption • Adopted Goals ⊆ Candidate Goals (⊆ Desires) • Intention reconsideration • Extended agent life-cycle • Non-adopted Candidate Goals • Execution problems with Adopted Goals • Propose combined commitment deliberation mechanism • Based on agent’s deliberation over its mental states • Bounded rationality: as far as the agent believes and can compute

  20. BDI Control Cycle [Diagram: cycle of world state changes → identify changes to mental state → commitment deliberation → decide on response → perform actions → (world state changes again)]
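Written as a loop, the cycle might look like the sketch below; observe, deliberate, decide, and act are placeholders for the agent's real machinery.

    def control_cycle(state, observe, deliberate, decide, act, running=lambda: True):
        """Schematic rendering of the BDI control cycle on this slide."""
        while running():
            percepts = observe()                 # world state changes
            state = deliberate(state, percepts)  # identify changes to mental state; commitment deliberation
            actions = decide(state)              # decide on response (adopt/drop goals, select intentions)
            for action in actions:               # perform actions
                act(action)
        return state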

  21. Mental State Transitions [Diagram: observe → commitment deliberation → decide → act] • Current mental state S = (B,GC,GA,I) • Omit D since suppose single “satisfy user” desire • Outcome of deliberation is new state S' • Possible new transitions: • Expansion: adopt additional goal • No modification to existing goals or intentions • Revocation: drop adopted goal + intention • To enable a different goal in the future • Proactive: create new candidate goal and adopt it • To enable a current candidate goal in the future • Plus standard BGI transitions • E.g. drop an intention due to plan failure
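The new transition types could be encoded as a small vocabulary over S = (B, GC, GA, I); an illustrative sketch only:

    from enum import Enum, auto

    class Transition(Enum):
        """Possible outcomes of commitment deliberation over S = (B, GC, GA, I)."""
        EXPANSION = auto()   # adopt an additional goal; existing goals and intentions untouched
        REVOCATION = auto()  # drop an adopted goal and its intention, to enable a different goal later
        PROACTIVE = auto()   # create and adopt a new candidate goal, to enable a current candidate later
        STANDARD = auto()    # ordinary BGI transitions, e.g. dropping an intention on plan failure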

  22. Goal and Intention Attributes • Goals: • User-specified value/utility (can be time-varying) • User-specified priority • User-specified deadline • Estimate cost to achieve • Level of commitment so far (for adopted goals) • Intentions: • Implied value/utility • Cost of change (deliberative effort, loss of utility, delay) • Level of commitment • Level of effort so far (e.g. estimated % complete) • Estimated cost to complete • Estimated prob. success
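These attribute lists map directly onto record types that the deliberation can score over; a sketch with hypothetical field names:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GoalAttributes:
        value: float                    # user-specified value/utility (may be time-varying)
        priority: int                   # user-specified priority
        deadline: Optional[float]       # user-specified deadline, if any
        est_cost: float                 # estimated cost to achieve
        commitment_so_far: float = 0.0  # level of commitment so far (meaningful for adopted goals)

    @dataclass
    class IntentionAttributes:
        implied_value: float         # value/utility implied by the goal the intention serves
        change_cost: float           # cost of change: deliberative effort, loss of utility, delay
        commitment: float            # level of commitment
        pct_complete: float          # level of effort so far, e.g. estimated % complete
        est_cost_to_complete: float  # estimated cost to complete
        prob_success: float          # estimated probability of success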

  23. Making the Best Decision • S→S' transition as multi-criteria optimization • Maximize (minimize) some combination of criteria over S • Can be simple or complex • Bounded rationality • Simple default strategy, customizable by user • Advice acts as constraints ⇒ constrained (soft) multi-criteria optimization problem • “Don’t drop any intention > 70% complete” • Assistive agent can consult user if no clear best S' • “Should I give up on purchasing a laptop, in order to satisfy your decision to travel to both conferences?” • Learn and refine model of user’s preferences
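One simple realization of the S → S' choice is a weighted-sum score with advice rendered as hard or soft constraints; the weights, criteria, and penalty scheme below are illustrative assumptions, not the deck's algorithm.

    def best_transition(candidates, criteria, weights, advice, soft_penalty=1000.0):
        """Pick the successor state S' maximizing a weighted sum of criteria, respecting advice.

        candidates: possible successor states S'
        criteria:   dict name -> function(state) -> float, higher is better
        weights:    dict name -> float
        advice:     list of (predicate, hard) pairs; hard constraints filter candidates,
                    soft ones subtract a penalty when violated
        """
        best, best_score = None, float("-inf")
        for s in candidates:
            if any(hard and not ok(s) for ok, hard in advice):
                continue  # e.g. "don't drop any intention > 70% complete"
            score = sum(w * criteria[name](s) for name, w in weights.items())
            score -= soft_penalty * sum(1 for ok, hard in advice if not hard and not ok(s))
            if score > best_score:
                best, best_score = s, score
        return best  # None or a near-tie would be the cue to consult the user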

  24. Example • Candidate goals: • c1: “Purchase a laptop” • c2: “Attend AAAI” • Adopted goals and intentions: • g1 with intention i1: “Purchase a high-end laptop using general funds” • g2 with intention i2: “Attend AAAI and its workshops, staying in conference hotel” • New candidate goal from user: • c3: “Attend AAMAS” (high priority) • Mental state S = (B, {c1,c2,c3}, {g1,g2}, {i1,i2})

  25. Example (cont.) • CALO finds cannot adopt c3 • {g1,g2,g3} resource contention – insufficient general funds • Options include: • Do not adopt c3 (don’t attend AAMAS) • Drop c1 or c2 (laptop purchase or AAAI attendance) • Modify g2 to attend only the main AAAI conference • But changing i2 incurs a financial penalty • Adopt a new candidate goal c4 to apply for a departmental travel grant • Advice prohibits option 2

  26. Example (cont.) • CALO builds optimization problem and solves it • Problem constructed and solution method employed both depend on agent’s nature • E.g. ignore % of intention completed • No more than 10ms to solve • Finds best is tie between options 3 and 4 • Agent’s strategy (based on user guidance) is to consult user over which to do • User instructs CALO to do both options • New mental state S' = (B', {c1,c2,c3,c4}, {g1,g'2,g3,g4}, {i1,i'2})
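For concreteness, the deliberation in this example can be mocked up as below; the option names and scores are invented for illustration and are not taken from the paper.

    # Hypothetical scoring of the four options for resolving the resource contention
    options = {
        "reject_c3":     {"score": 0.2, "allowed": True},   # option 1: don't attend AAMAS
        "drop_c1_or_c2": {"score": 0.6, "allowed": False},  # option 2: ruled out by advice
        "shrink_g2":     {"score": 0.8, "allowed": True},   # option 3: main AAAI conference only
        "adopt_c4":      {"score": 0.8, "allowed": True},   # option 4: apply for a travel grant
    }
    feasible = {name: o["score"] for name, o in options.items() if o["allowed"]}
    top = max(feasible.values())
    best = [name for name, s in feasible.items() if s == top]
    # best == ["shrink_g2", "adopt_c4"]: a tie, so the agent's user-guided strategy is to ask;
    # the user picks both, giving S' = (B', {c1,c2,c3,c4}, {g1,g'2,g3,g4}, {i1,i'2})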

  27. Overview • CALO: a learning cognitive assistant • User delegation of tasks to CALO • Delegative BDI agent framework • Goal adoption and commitments • Summary and research issues

  28. Summary • CALO acts as user’s intelligent assistant • Classical BDI framework inadequate • Implemented BDI systems lack formal grounding • Proposed delegative BDI agent framework • Separate Desires and Goals • Separate Candidate and Adopted Goals • Incorporate user guidance and preferences • Combined commitment deliberation for goal adoption and intention reconsideration • Enables reasoning necessary for an agent such as CALO • Implemented by extending SPARK agent framework

  29. Related Work • BOID framework [Broersen et al] • Different types of agents based on B/D/G/I conflict resolution strategies • BDGICTL logic [Dastani et al] • Merging desires into goals • Intention reconsideration [Schut et al] • Collaborative problem solving [Levesque and Cohen; Allen and Ferguson] • Social norms and obligations [Dignum et al]

  30. Future Work • Extend goal reasoning to consider resource feasibility (in progress) • Proactive goal anticipation and adoption • Collaborative human-CALO problem solving • Beyond (merely) completing user-delegated tasks • Multi-CALO coordination and teamwork • Learning as part of CALO’s extended life-cycle • More information: http://calo.sri.com/
