
Space Applications for Distributed Constraint Reasoning



  1. Space Applications for Distributed Constraint Reasoning Brad Clement, Tony Barrett Artificial Intelligence Group Jet Propulsion Laboratory California Institute of Technology brad.clement@jpl.nasa.gov http://ai.jpl.nasa.gov/

  2. Outline
  • Applications
    • multi-spacecraft missions
    • “collaborative” mission planning
    • network scheduling
  • Current approaches
  • Challenges for DCR
  • Unsolicited opinions

  3. Multi-Robot Control
  • Goal selection
  • Future commanding
  • Commanding now
    • mode estimation/diagnosis
  • Perception & actuation

  4. Control of/by Humans
  • Goal selection
  • Future commanding
  • Commanding now
    • mode estimation/diagnosis
  • Perception & actuation

  5. Distributed Constrained Optimization
  Optimize a function of variable assignments with both local and non-local constraints.
  (Diagram: an Analyst / Planner / Executive / Control stack.)
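The optimization stated on this slide can be made concrete with a toy sketch. Everything below (the two agents, the domain, and both utility functions) is illustrative, and the search is centralized brute force; real distributed algorithms such as ADOPT or DPOP distribute this search across the agents.

```python
from itertools import product

# Hypothetical two-agent problem: each agent owns one variable; the
# objective combines a local preference with a non-local constraint.
DOMAIN = [0, 1, 2]

def local_utility(agent, value):
    # Each agent prefers higher values (a stand-in for local science value).
    return value

def shared_utility(a_val, b_val):
    # Non-local constraint: agents are rewarded for agreeing.
    return 3 if a_val == b_val else 0

def solve():
    # Centralized brute force over joint assignments, for illustration only.
    return max(product(DOMAIN, DOMAIN),
               key=lambda av: local_utility('A', av[0]) +
                              local_utility('B', av[1]) +
                              shared_utility(av[0], av[1]))

print(solve())  # (2, 2): agreeing on the highest value wins
```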

  6. Space Applications
  Decentralize decision-making?
  • competing objectives (self-interest)
  • control is already distributed
  • communication constraints/costs
  • computation constraints
  • multiple rovers
  • spacecraft constellation
    • Earth orbiters
    • Mars network
  • DSN antenna allocation
  • mission planning
  • construction, repair
  • crew operations

  7. Applications – Multiple Spacecraft
  Over 40 multi-spacecraft missions proposed!
  • Autonomous single-spacecraft missions have not yet reached maturity.
  • How can we cost-effectively manage multiple spacecraft?
  (Program logos: Origins Program, NMP, Earth Observing System, Sun-Earth Connections, Structure & Evolution of the Universe, Mars Network.)

  8. Applications – Multiple Spacecraft: Classification of Phenomena (Underlying Scientific Questions)
  Five Classification Metrics
  • Signal Location: Where are the signals?
  • Signal Isolation: How close are distinct signals in the phenomenon?
  • Information Integrity: How much noise is inherent in each signal?
  • Information Rate: How fast do the signals change?
  • Information Predictability: How predictable is the phenomenon?
  (Figure: signals from the celestial sphere and the magnetosphere, plotted over space and time.)

  9. Applications – Multiple Spacecraft: Multiple Platform Mission Types
  • Signal Separation
  • Signal Space Coverage
  • Signal Combination
  (Figure: two charts, Noise vs. Resolution Need for isolation & integrity and Rate vs. Predictability, locating the region where single spacecraft suffice.)

  10. Space Applications – Science: How to Distribute?
  Who gets which components?
  (Figure: a fleet of spacecraft connected by cross-links, each carrying some subset of GN&C, Executive, Planner, and Analyst components.)

  11. Autonomous Signal Separation
  (Figure: three spacecraft, each with GN&C and an Executive, sharing a single Planner and Analyst.)
  • Why many executives?
    • Each spacecraft can have local anomalies.
    • During an anomaly, communications can be lost due to drift.
  • Why only one planner?
    • During normal operations the spacecraft are guaranteed to be able to communicate.
    • Since spacecraft join to make an observation, only one analyst is needed.

  12. Autonomous Signal Space Coverage
  (Figure: three spacecraft, each with its own GN&C, Executive, Planner, and Analyst.)
  • Why many planners?
    • Cross-link is lost during normal operations, but spacecraft still have to manage local activities and respond to science events.
  • Why communicate at all?
    • The value of local measurements is enhanced when combined with data from others. Analysts must coordinate over collection.

  13. Autonomous Signal/Mission Combination
  (Figure: three spacecraft, each with its own GN&C, Executive, Planner, and Analyst.)
  • How does this differ from signal space coverage?
    • Each entity has different capabilities:
      • Sensors: radar, optical, IR...
      • Mobility: satellite, rover...
      • Communications abilities.
    • Each mission has its own motivations.
    • There is a competition where each mission wants to optimize its own objectives in isolation.

  14. Space Applications – Mission Operations
  (Figure: Techsat-21 ground data system, with AFSCN commanding on L-band, spacecraft and payload ground stations, payload data delivered overnight by FTP, and a mission operations center running ASPEN, SCL, and TT&C workstations that exchange new, rescheduled, rejected, and removed activities with the mission planning workstation.)
  Decentralize decision-making?
  • competing objectives (self-interest)
  • control is already distributed
  • communication constraints/costs
  • computation constraints
  Techsat-21
  • multiple instruments on a spacecraft contend for resources
  • multiple scientists may compete for one instrument (HST)
  • scientists work with operations staff to make sure goals can be safely achieved
  • plans must be validated (carefully simulated)
  • changes made by users in parallel invalidate validation

  15. Applications - Deep Space Network (DSN)

  16. Applications - Deep Space Network (DSN)
  Decentralize decision-making?
  • competing objectives (self-interest)
  • control is already distributed
  • communication constraints/costs
  • computation constraints
  • 56 missions
  • 12 antennas
    • different capabilities
    • shared equipment
  • geometric constraints
  • human operator constraints
  • some missions schedule as long as 10 years into the future
  • some require a schedule freeze 6 months out
  • complicated requirements, originally from agreements with NASA, with flexibility in antennas, timing, numbers of tracks, gaps, etc.
  • schedule centrally generated; meetings and horse trading to resolve conflicts
  • similar to coordinating operations across missions

  17. Applications – DSN Arrays
  Decentralize decision-making?
  • competing objectives (self-interest)
  • control is already distributed
  • communication constraints/costs
  • computation constraints
  • NASA may build 3600 10m weather-sensitive antennas
    • 1200 at each complex, in groups of 100 spread over a wide area
  • High automation requested: one operator for 100 or 1200 antennas
  • Spacecraft may use any number of antennas for varying QoS, and may need a link carried across complexes
  • Only some subsets of antenna signals can be combined
    • depends on the design of wiring/switching to combiners
    • combiners may be limited
  • Local response time should be minimized
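The constraint that only some subsets of antenna signals can be combined might be modeled as a coverage check against combiner wiring. The combiner and antenna names below are hypothetical; a real model would also capture combiner capacity limits and cross-complex links.

```python
# Hypothetical wiring model: each combiner is hard-wired to a fixed set of
# antennas, so a requested subset is feasible only if some single combiner
# covers all of it.
COMBINERS = {
    "c1": {"a1", "a2", "a3"},
    "c2": {"a3", "a4"},
}

def combinable(requested):
    """True if one combiner's wiring covers every requested antenna."""
    return any(requested <= wired for wired in COMBINERS.values())

print(combinable({"a1", "a3"}))  # True  (covered by c1)
print(combinable({"a2", "a4"}))  # False (spans two combiners)
```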

  18. Mars Network
  • Network traffic scheduled far in advance
  • Windows of comm availability
  • Need to react to unexpected events and reschedule
  • Missions must control their own spacecraft
  • Comm affects resources that are needed for other operations
  • Continual negotiation
  (Figure: relay network among MGS, MEX, Odyssey, MER A, and MER B.)
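The react-and-reschedule loop on this slide can be sketched as dropping an affected relay pass and greedily rebooking it into the next free availability window. The window times, relay names, and greedy policy below are all illustrative, not mission data.

```python
# asset -> list of (start, end) comm availability windows, in minutes
windows = {
    "Odyssey": [(0, 10), (30, 40)],
    "MGS": [(15, 25)],
}
schedule = [("MER A", "Odyssey", (0, 10))]  # (mission, relay, window)

def reschedule(lost_window):
    """Drop passes in a lost window and rebook them in free windows."""
    global schedule
    affected = [p for p in schedule if p[2] == lost_window]
    schedule = [p for p in schedule if p[2] != lost_window]
    booked = {p[2] for p in schedule}
    for mission, _, _ in affected:
        for relay, slots in windows.items():
            free = next((w for w in slots
                         if w != lost_window and w not in booked), None)
            if free:
                schedule.append((mission, relay, free))
                booked.add(free)
                break

reschedule((0, 10))
print(schedule)  # [('MER A', 'Odyssey', (30, 40))]
```

A continual-negotiation version would repeat this whenever a window is lost, with the affected missions bidding for the remaining windows instead of a single greedy pass.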

  19. How does DCR fit?
  • Goal selection
    • task allocation
  • Future commanding
    • meeting scheduling
  • Commanding now
    • mode estimation/diagnosis
  • Perception & actuation

  20. Distributed Constraint Reasoning for Planning & Scheduling
  • Allocating events/resources to time slots (meeting scheduling)
    • Hannebauer and Mueller, AAMAS 2001
    • Maheswaran et al., AAMAS 2004
    • Modi & Veloso, AAMAS 2005
  • Coordinating plans by making coordination decisions variables
    • Cox et al., AAMAS 2005

  21. Shared Activity Coordination
  Shared activities implement team plans, joint actions, and shared states/resources.
  (Figure: three Planner/Executive pairs linked by equality constraints over shared activities.)

  22. Shared Activity Coordination (SHAC, Clement & Barrett, 2003)
  • continual coordination algorithm
  • language for coordinating planning agents
  • framework for defining and implementing automated interactions between planning agents (a.k.a. coordination protocols/algorithms)
  • software
    • planner-independent interface
    • protocol class hierarchy
    • testbed for evaluating protocols

  23. Shared Activity Model
  • parameters (string, integer, etc.)
  • constraints (e.g. agent4 allows start_time in [0,20], [40,50])
  • decompositions (shared subplans)
  • permissions - to modify parameters, move, add, delete, choose decomposition, constrain
  • roles - maps each agent to a local activity
  • protocols - defined for each role
    • change constraints
    • change permissions
    • change roles
      • includes adding/removing agents assigned to the activity
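One way the shared activity model above could be rendered as a data structure is sketched below. The field names follow the slide; the example activity, agent names, and the permission-check helper are illustrative, not the SHAC implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SharedActivity:
    name: str
    parameters: dict = field(default_factory=dict)   # e.g. {"start_time": 10}
    constraints: dict = field(default_factory=dict)  # agent -> allowed intervals
    permissions: dict = field(default_factory=dict)  # agent -> set of allowed ops
    roles: dict = field(default_factory=dict)        # agent -> local activity
    protocols: dict = field(default_factory=dict)    # role -> protocol name

obs = SharedActivity(
    name="joint_observation",
    parameters={"start_time": 10},
    constraints={"agent4": [(0, 20), (40, 50)]},     # the slide's example
    permissions={"agent1": {"move", "add"}, "agent4": set()},
    roles={"agent1": "point_and_image", "agent4": "relay_data"},
    protocols={"point_and_image": "round_robin"},
)

def may(agent, op, activity):
    """Permission check before an agent modifies the shared activity."""
    return op in activity.permissions.get(agent, set())

print(may("agent1", "move", obs), may("agent4", "move", obs))  # True False
```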

  24. Control Protocols for a Shared Activity
  • Chaos
    • A free-for-all among planners
  • Master/Slave
    • The master has permissions, slaves don't
  • Round Robin
    • The master role passes round-robin among planners
  • Asynchronous Weak Commitment (AWC)
    • The neediest planner becomes master
  • Variations
    • how many planners share the activity
    • use of constraints

  25. Asynchronous Weak Commitment
  AWC::modifyPermissions()
  • if have highest priority
    • remove self's modification permissions (add, move, delete)
  • else
    • give self modification permissions
  AWC::modifyConstraints()
  • if cannot resolve local conflicts and conflicts with constraints of higher-ranking agents
    • set own rank to highest rank plus one
    • generate parameter constraints (no-good) describing locally consistent values
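A near-literal Python rendering of the pseudocode on this slide follows. The rank representation, the simple priority rule, and the no-good encoding are simplified stand-ins for whatever SHAC actually uses.

```python
class AWCAgent:
    """Sketch of one agent's AWC permission and constraint updates."""

    def __init__(self, name, rank):
        self.name, self.rank = name, rank
        self.permissions = set()
        self.nogoods = []

    def modify_permissions(self, all_ranks):
        # Highest-priority agent gives up modification rights; others take them.
        if self.rank == max(all_ranks):
            self.permissions -= {"add", "move", "delete"}
        else:
            self.permissions |= {"add", "move", "delete"}

    def modify_constraints(self, resolvable, all_ranks, consistent_values):
        # On a dead end, jump to the top rank and publish a no-good
        # describing the locally consistent parameter values.
        if not resolvable:
            self.rank = max(all_ranks) + 1
            self.nogoods.append(consistent_values)

a = AWCAgent("sc1", rank=1)
a.modify_permissions(all_ranks=[1, 2, 3])
print(a.permissions)  # contains 'add', 'move', and 'delete' (not highest rank)
a.modify_constraints(False, all_ranks=[1, 2, 3],
                     consistent_values={"start_time": (0, 20)})
print(a.rank)  # 4
```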

  26. Experiments – Abstract Problem
  • joint measurements
  • capability matching
  • 3-9 spacecraft
  • 1-7 capabilities
  • 1-9 joint goals, each requiring 1-4 of each capability

  27. Experimental Results (progress over CPU time)
  (Chart: number of problems solved vs. max CPU time in seconds for AWC, Round Robin, Chaos, and Master/Slave. Chaos produced invalid solutions; Master/Slave is not complete.)

  28. Computing Consensus Windows
  (Diagram: agents A, B, and C each have a timeline with a consensus window preceding the execute time. Within the window, a decision is reached by voting or auction, with the highest-ranked agent deciding.)
  29.-32. (Animation frames of the same diagram: votes are collected from the agents, and the decision is settled within the consensus window before execution.)
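The voting-or-auction step depicted in these frames might look like the sketch below, where votes are tallied inside the consensus window and the highest-ranked agent breaks ties. The agent names, ranks, and vote values are made up for illustration.

```python
from collections import Counter

def decide(votes, ranks):
    """votes: agent -> proposed value; ranks: agent -> priority."""
    tally = Counter(votes.values())
    top = max(tally.values())
    tied = [value for value, count in tally.items() if count == top]
    if len(tied) == 1:
        return tied[0]  # clear majority
    # Tie: the highest-ranked agent's vote wins.
    leader = max(ranks, key=ranks.get)
    return votes[leader]

votes = {"A": 10, "B": 20, "C": 20}
print(decide(votes, ranks={"A": 3, "B": 1, "C": 2}))  # 20 (majority)
```

In an execution setting, `decide` would only be called once the consensus window closes, so the chosen value is fixed before the execute time.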

  33. Causal Inconsistency
  Order of events:
  1. a is master and shares with (adds to roles) b
  2. b receives add from a
  3. a replaces b with c and makes c master
  4. c receives add message making it master
  5. c makes b master and removes self (deletes)
  6. b receives add/master from c (before delete from a)
  7. a receives update from c
  8. b receives delete from a
  A SHAC protocol is proven sound if
  • the underlying planners are sound,
  • the protocol ensures that only one agent has permissions over any piece of information, and
  • it employs causally consistent communication.
  (Diagram: the numbered add, add/master, update, and delete messages passing among agents a, b, and c.)
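The causally consistent communication that soundness requires is commonly achieved with vector clocks: each message carries a clock, and a receiver buffers any message whose causal dependencies it has not yet delivered. The sketch below applies the standard causal-delivery test to the slide's scenario; the clock values are illustrative.

```python
AGENTS = ["a", "b", "c"]

def deliverable(msg_clock, sender, local_clock):
    """Standard causal-delivery test for vector clocks."""
    if msg_clock[sender] != local_clock[sender] + 1:
        return False  # missing an earlier message from this sender
    # All other causal dependencies must already be delivered locally.
    return all(msg_clock[x] <= local_clock[x]
               for x in AGENTS if x != sender)

# b's view after receiving only a's first "add": one message seen from a.
b_clock = {"a": 1, "b": 0, "c": 0}
# c's "make b master" message causally depends on a's *second* message
# (the replacement), which b has not seen, so b must buffer it.
msg = {"a": 2, "b": 0, "c": 1}
print(deliverable(msg, sender="c", local_clock=b_clock))  # False
```

Buffering that message until a's delete arrives removes the inconsistency in steps 6-8 above: b processes a's delete before c's add/master, matching the causal order.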

  34. Summary
  • Many space applications for distributed constraint reasoning
  • Many involve model-based causal systems
  • Need to map these systems to DCRs
    • how are CSPs mapped?
  • Need to handle
    • continuous variables (including CPU)
    • limited computation
      • not 1000 computers, but 2-10
    • communication outages, unreliability, guarantees
