
The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment



  1. The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment B. Clement, J. Frank, J. Chachere, T. Smith and K. Swanson KEPS 2011

  2. Outline • Motivation • Example • Concept of Operations • IMDE Architecture • Demonstration • Technology Foundations • Challenges

  3. The Problem • Model-based planning: a good thing • Describe the planning problem in a declarative modeling language (e.g., PDDL) • A single code base can solve many types of planning problems described in this language • Model-based planning has been used successfully in many applications, including space missions • Modeling and model validation are a problem • The source of the model is often complex • The source of the model is often at a finer granularity than the planner model needs • Modeling languages are unable to represent 'real' constraints • Full test coverage is often impossible

  4. The Problem • Our specific focus: • Validation of a model against a simulation • Simulation: • Command and data description available • Documentation describing expected behavior • Otherwise, a 'Black Box' • Model: • To be built to configure an automated planner • The person building the model has access to the simulation • A common scenario in space applications

  5. Example [Figure: spacecraft with X, Y, Z body axes] • Suppose we are developing an automated planning system for a spacecraft. • How would we model a slew? • A slew is a change in attitude. • Attitude is orientation, say angles <x, y, z> around the x, y, and z axes. • We want to keep track of pointing constraints for solar power, communication to Earth, and science measurements to a target.

  6. Example [Figure: spacecraft with X, Y, Z body axes]
• ACS: 4 reaction wheels, only 3 used; which ones depends on slew details
  • Reaction wheel commands: start, change speed, off
  • Reaction wheel data: RPM, direction, on, off
  • Star tracker: sun position
• EPS: subsystems powered, battery state of charge, voltage on channel
• CPU: on, off
• IMU: on, off; <x, y, z> attitude; <x, y, z> rate of change

  7. Example • Issues: • How does the abstract slew activity map to sequences of spacecraft commands? • How long does a slew take? • How do we know when a continuous <x, y, z> attitude corresponds to a discrete ?from / ?to value? (See the sketch below.) • Validating the planner model requires ensuring that the constraints/effects in the model conform to the behavior of the simulator
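
The third issue above is a data abstraction: deciding when a continuous <x, y, z> attitude counts as a discrete pointing state. A minimal Java sketch of one way to do it; the Attitude record, the target angles, and the 5-degree tolerance are illustrative assumptions, not values from the paper:

```java
import java.util.Map;

// Illustrative sketch: classify a continuous <x, y, z> attitude as a
// discrete pointing state by matching a known target within a tolerance.
public class PointingAbstraction {
    record Attitude(double x, double y, double z) {}

    // Hypothetical target catalog; real values come from the mission.
    static final Map<String, Attitude> TARGETS = Map.of(
        "Earth",    new Attitude(10.0, 20.0, 30.0),
        "NavStar1", new Attitude(45.0, 0.0, 90.0));

    static final double TOLERANCE_DEG = 5.0;  // assumed threshold

    // Returns the discrete pointing state, or null if no target matches.
    static String pointing(Attitude a) {
        for (var e : TARGETS.entrySet()) {
            Attitude t = e.getValue();
            double err = Math.max(Math.abs(a.x() - t.x()),
                         Math.max(Math.abs(a.y() - t.y()),
                                  Math.abs(a.z() - t.z())));
            if (err <= TOLERANCE_DEG) return e.getKey();
        }
        return null;  // attitude is between discrete states, e.g. mid-slew
    }

    public static void main(String[] args) {
        System.out.println(pointing(new Attitude(11.0, 19.5, 30.2)));  // Earth
        System.out.println(pointing(new Attitude(30.0, 10.0, 60.0)));  // null
    }
}
```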

  8. Example

[Figure: plan timeline for a slew from NavStar1 to Earth over a 00:10 window, with communication on, sun angle > 20, and battery charge > 2]

```
(:durative-action slew
  :parameters (?from - attitude ?to - attitude)
  :duration (= ?duration 5)
  :condition (and (at start (pointing ?from))
                  (at start (>= (sunangle) 20.0))
                  (over all (>= (sunangle) 20.0))
                  (at start (communicating))
                  (over all (communicating))
                  (at start (>= (batterycharge) 2.0)))
  :effect (and (at start (decrease (batterycharge) 2.0))
               (at start (not (pointing ?from)))
               (at end (pointing ?to))))
```

  9. Example
• Simulate a plan solution with a single slew to Earth and a goal (pointing Earth).
  • Construct simulation commands from the plan
  • Construct the initial state from the plan
  • Select other initial conditions not part of the plan's initial state
• Simulation output shows no change in attitude.
  • There must be an error in the model or abstractions, because the goal fails.
  • No other output suggests anything went wrong.
  • No information to identify the error!
• Simulate more slews with different initial states.
  • Some simulations show successful slews, and others show no slew.
  • The most noticeable pattern is that whenever the simulator's cpu-on is zero, the slew is not executed. (See the sketch after this list.)
• Fix the model:
  • add the cpu-on simulator state variable to the planning model
  • add a cpu-on precondition to the slew activity, OR
  • add a turnOnCpu() simulator command to the refinement, OR
  • add the turnOnCpu() simulator command to the planning model and add a temporal constraint that slew be preceded by turnOnCpu().
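
The manual pattern hunt above can be mechanized: run many simulations and look for initial-state variables whose value separates failed runs from successful ones. A hedged sketch of that idea; the Run record and the variable encoding (nonzero = on) are assumptions for illustration:

```java
import java.util.*;

// Illustrative sketch: find initial-state variables whose value perfectly
// separates failed slews from successful ones across simulation runs,
// e.g. discovering that cpu-on == 0 always coincides with a failed slew.
public class DiscrepancyPattern {
    record Run(Map<String, Integer> initialState, boolean slewSucceeded) {}

    static List<String> suspiciousVariables(List<Run> runs) {
        List<String> suspects = new ArrayList<>();
        Set<String> vars = runs.get(0).initialState().keySet();
        for (String v : vars) {
            boolean separates = runs.stream().allMatch(
                r -> (r.initialState().get(v) != 0) == r.slewSucceeded());
            if (separates) suspects.add(v);
        }
        return suspects;
    }

    public static void main(String[] args) {
        List<Run> runs = List.of(
            new Run(Map.of("cpu-on", 0, "imu-on", 1), false),
            new Run(Map.of("cpu-on", 1, "imu-on", 1), true),
            new Run(Map.of("cpu-on", 1, "imu-on", 0), true));
        System.out.println(suspiciousVariables(runs));  // [cpu-on]
    }
}
```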

  10. Concept of Operations Today
• figure out how the system works from various sources: documentation, people, interaction with the simulation testbed
• figure out what system commands correspond to activities
• write action and state/resource models using a text editor, based on available descriptions of the system
• create planning problems and run the planner
• look at solutions to debug the model; if there is no solution, figure out why
• run plans through the simulator and figure out whether they did what was expected
• correct and augment the model based on new knowledge
• create a planning testbed: auto-generate planning problems as test cases, auto-check simulator output for "problems"

  11. Concept of Operations Tomorrow • integrate the simulation testbed with the IMDE • integrate the planner with the IMDE (if it isn't already) • the model is developed by using the IMDE to • abstract simulator commands into actions for planning, • abstract simulator output into states/resources, and • add constraints and effects to action models • the IMDE identifies errors in the model and suggests corrections

  12. Motivation (redux) • Model checking does not help validate the model in its system context. • Prior plan verification/validation work • verifies constraints/properties encoded in another language, • ensures against foreseeable problems, • verifies self-consistency, or • validates a plan as a solution. • We are interested in checking against system behavior as represented by a simulation. • Simulation can be used to check simultaneously for all problems, many of which • may not be expressible in the model checking language or • may be unforeseeable.

  13. Motivation (redux) • We are NOT auto-generating the planning model from the simulation, because • The planning domain modeler does not build the simulation. • The simulation is usually a black box. • Even if it weren't, it would be too difficult to transform the white-box model into our planning representations. • And the simulator can't be used for up-front planning (it is not designed to be used this way).

  14. Integrated Model Development Environment • The IMDE will bridge the gap between planning and simulation • Interfaces to simulation • Present the simulation command and data API to the planning model developer during model editing. • Automate generation of plans and their simulations. • Automate translation of simulation output into planning model states. • Automate detection of errors in the model based on inconsistencies between the plan and the simulation. • Automate identification of possible causes of errors from multiple runs. • Automate suggestion of fixes to the model

  15. IMDE Architecture

[Figure: architecture data flow — the Model Editor produces the model for the Planner; the Planner's plan goes through the Refinement Engine to simulator commands; the Simulator's state timelines go through the Abstraction Engine to a simulated execution; the Validator turns that into errors, and the Fixer into suggestions; the API Browser exposes simulator data and commands to the Abstraction Editor]

• Abstraction Editor: stores relationships between plan and simulation entities
• Model Editor: integrated tightly with the simulation
• Refinement Engine: translations between plans and simulations; employs stored abstractions
• Abstraction Engine: translates simulation traces to 'as-executed' plans
• Validator: checks 'as-executed' plans against the model
• Fixer: proposes fixes to model abstractions

  16. IMDE Architecture: Abstraction Editor • Documents how the model (actions, states, parameters, values) relates to the simulation command and data specification • Provides human-readable traceability between systems • The formal representation of abstractions is input to the Refinement/Abstraction Engine • Examples (see the sketch below): • Command abstraction: slew -> command sequence • Data abstraction: reaction wheel rotation rates -> slewing
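
For concreteness, one plausible shape for what the Abstraction Editor stores, covering both examples above. The Abstraction record and its fields are illustrative assumptions, not the paper's actual data model:

```java
import java.util.List;

// Illustrative sketch: the kind of record an Abstraction Editor might
// store to relate plan-model entities to simulator commands and data.
public class AbstractionStore {
    record Abstraction(String modelEntity,       // e.g. action or state name
                       List<String> simEntities, // commands or telemetry
                       String note) {}           // human-readable rationale

    static final List<Abstraction> ABSTRACTIONS = List.of(
        new Abstraction("slew",
            List.of("startWheel", "changeWheelSpeed", "stopWheel"),
            "slew expands to a reaction-wheel command sequence"),
        new Abstraction("slewing",
            List.of("wheel1.rpm", "wheel2.rpm", "wheel3.rpm"),
            "nonzero wheel rotation rates abstract to the slewing state"));

    public static void main(String[] args) {
        ABSTRACTIONS.forEach(a ->
            System.out.println(a.modelEntity() + " <-> " + a.simEntities()));
    }
}
```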

  17. IMDE Architecture: Refinement / Abstraction Engine • Refinement: • Transforms the plan and initial state into simulation input and a command sequence • Transformation defined by 'inverting' abstractions created in the Abstraction Editor • Abstraction: • Transforms simulation output into 'predicted' plan states • Transformation uses abstractions created in the Abstraction Editor • Examples (see the sketch below): • Translate discrete pointing attitudes to <x, y, z> targets when setting up a simulation run from an output plan • Translate a time series of <x, y, z> outputs to discrete positions to determine whether the plan achieved the desired pointing state
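
A minimal sketch of the refinement direction, inverting the pointing abstraction by looking up the <x, y, z> target for a discrete attitude named in the plan. The target catalog and its values are hypothetical:

```java
import java.util.Map;

// Illustrative sketch of refinement: invert the pointing abstraction by
// looking up the <x, y, z> target for a discrete attitude named in a plan.
public class PointingRefinement {
    record Attitude(double x, double y, double z) {}

    // Same hypothetical target catalog used in the abstraction direction;
    // sharing one table makes refinement the literal inverse of abstraction.
    static final Map<String, Attitude> TARGETS = Map.of(
        "Earth",    new Attitude(10.0, 20.0, 30.0),
        "NavStar1", new Attitude(45.0, 0.0, 90.0));

    // Refinement: plan-level (pointing Earth) -> simulator slew target.
    static Attitude refine(String discreteAttitude) {
        Attitude t = TARGETS.get(discreteAttitude);
        if (t == null)
            throw new IllegalArgumentException("unknown attitude: " + discreteAttitude);
        return t;
    }

    public static void main(String[] args) {
        System.out.println(refine("Earth"));  // Attitude[x=10.0, y=20.0, z=30.0]
    }
}
```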

  18. IMDE Architecture: Validator • Compares states as generated by the planner to those abstracted from the simulation • Differences indicate a discrepancy between the plan model and simulation behavior • Can also report simulator errors, constraint violations not caught by the simulator, etc. • Examples (see the sketch below): • End pointing state achieved late • End pointing state not achieved
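
A hedged sketch of the validation check for the two examples above: given the pointing state the model predicts by a deadline and a state timeline abstracted from the simulation, report success, a late achievement, or a miss. The TimelineValue representation is an assumption:

```java
import java.util.List;

// Illustrative sketch of the Validator: compare the pointing state the
// planner predicts at plan end against the states abstracted from the
// simulation trace, reporting the slide's two example discrepancies.
public class Validator {
    record TimelineValue(double time, String value) {}

    static String validate(String expected, double expectedBy,
                           List<TimelineValue> simulated) {
        for (TimelineValue tv : simulated) {
            if (tv.value().equals(expected)) {
                return tv.time() <= expectedBy
                    ? "OK: " + expected + " achieved at t=" + tv.time()
                    : "ERROR: " + expected + " achieved late (t=" + tv.time() + ")";
            }
        }
        return "ERROR: " + expected + " never achieved";
    }

    public static void main(String[] args) {
        List<TimelineValue> trace = List.of(
            new TimelineValue(0.0, "NavStar1"),
            new TimelineValue(7.5, "Earth"));  // model predicted t=5.0
        System.out.println(validate("Earth", 5.0, trace));
    }
}
```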

  19. IMDE Architecture: Fixer • Given a discrepancy between the planner model and the simulator • Identify the part of the model that is to blame • Suggest changes to the model that will address the problem • Examples: • Simulation reports an error at the beginning of the slew because the CPU is not turned on • Discrepancies in slew duration point to path-dependent slew duration

  20. Validating the Model
• Instead of testing all possible plans, start with a single model element.
  • For example, simulate slew for different initial states and parameterizations.
  • Then simulate all two-activity plans (slew and another activity) with different temporal relations.
  • Continue while growing the number of activities (see the generator sketch below).
  • We may be able to avoid plans like turnOnCpu → turnOnCpu → slew.
• The likelihood of finding an error decreases, and confidence in the model increases.
• Is there a point where we can stop and claim slew is correct/valid? No, but we may with assumptions of causal independence.
  • If each action model is valid, the entire model is valid!
• Validate the model as it is built from scratch!
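
A sketch of the test-plan generator implied by this strategy: enumerate activity sequences of growing length, pruning immediate repeats such as turnOnCpu → turnOnCpu → slew. The activity set and the pruning rule are illustrative assumptions:

```java
import java.util.*;

// Illustrative sketch of test-plan generation: enumerate activity
// sequences of growing length, pruning back-to-back repeats that are
// unlikely to add coverage.
public class TestPlanGenerator {
    static final List<String> ACTIVITIES = List.of("slew", "turnOnCpu", "comm");

    static List<List<String>> plansOfLength(int n) {
        List<List<String>> plans = new ArrayList<>();
        grow(new ArrayList<>(), n, plans);
        return plans;
    }

    static void grow(List<String> prefix, int n, List<List<String>> out) {
        if (prefix.size() == n) { out.add(List.copyOf(prefix)); return; }
        for (String a : ACTIVITIES) {
            // prune plans that repeat the same activity back to back
            if (!prefix.isEmpty() && prefix.get(prefix.size() - 1).equals(a))
                continue;
            prefix.add(a);
            grow(prefix, n, out);
            prefix.remove(prefix.size() - 1);
        }
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 2; n++)  // grow the number of activities
            System.out.println(plansOfLength(n));
    }
}
```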

  21. IMDE Architecture

[Figure: same architecture data flow as slide 15, annotated with the implementation status below]

• Abstraction Editor: Java functions (declarations to be auto-generated from drag-and-drop associations)
• Model Editor: simple Eclipse text editor
• Refinement Engine: invokes Java functions using reflection
• Abstraction Engine: invokes Java functions using reflection (see the sketch below); uses the EUROPA planner to reconstruct plan execution from simulator output (plan recognition or state estimation)
• Validator: EUROPA reports planning constraint violations of the simulated execution to indicate model errors
• Fixer: not yet implemented; needs to auto-generate plan test cases and to learn properties of plans whose simulations indicate modeling errors
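
A minimal sketch of the reflection mechanism described above: look up an abstraction function by name and invoke it on simulator outputs. The function here is a stand-in; the real ones would be generated from the editor's drag-and-drop associations:

```java
import java.lang.reflect.Method;

// Illustrative sketch: an engine invokes abstraction functions by name
// via reflection, as the prototype does with its Java functions.
public class ReflectiveInvoker {
    // Stand-in abstraction function; the real ones would be generated
    // from drag-and-drop associations in the Abstraction Editor.
    public static String pointingAbstraction(double x, double y, double z) {
        return (Math.abs(x - 10.0) < 5.0) ? "Earth" : "unknown";
    }

    public static void main(String[] args) throws Exception {
        Method m = ReflectiveInvoker.class.getMethod(
            "pointingAbstraction", double.class, double.class, double.class);
        Object state = m.invoke(null, 11.0, 20.0, 30.0);  // static invoke
        System.out.println(state);  // Earth
    }
}
```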

  22. IMDE Model/Abstraction Editor • Explore the sim API, the model, and plan simulations • Multi-select, drag, and drop to create abstractions [Screenshot: editor with Gantt view]

  23. Technology Foundations
• DAASA: spacecraft telemetry abstraction authoring and execution
• VAL: validation of plans against specifications
• ItSimple, PRIDE: integrating modeling and simulation
• LOCM: learning plan models from plan traces
• PAGODA: learning plan models from simulations
• SLATE: validating composite actions/behaviors through testing

  24. Challenges
• How can a complete but tractable space of test-case plans be identified for activity model validation?
• Can a single test case contribute to the validation of multiple model elements?
• Where is the true cause of the error: the model, the command refinement, the data abstraction, the simulator, or the actual system?
• What are the features of a learning problem for classifying an error: numbers of actions of each type, temporal ordering, initial state, simulation state at the time of the error?
• How can suggested fixes be generated for these errors?
• How can errors be identified and fixed for different modeling language features, such as uncertainty, parameter functions, and decompositions?
• How can the architecture be adapted to suggest changes for plan quality?
• How can we take advantage of white-box simulators: auto-generate refinements to sim commands, auto-fill the model?

  25. Conclusion and Future Work • Validating plan domains against simulations is a challenge • We are extending the notion of the IMDE as our solution: • Expose the sim API in the plan model editor • Provide a good UI • Document relationships between elements (abstractions) • Automate error detection and the suggestion of fixes • Working on a more realistic scenario
