MAITA: Monitoring, Analysis, and Interpretation Tool Arsenal

1. MAITA: Monitoring, Analysis, and Interpretation Tool Arsenal
Jon Doyle, William Long, Peter Szolovits, Philip Greenspun, Christine Tsien, Isaac Kohane, Cungen Cao
Clinical Decision Making Group, Laboratory for Computer Science, Massachusetts Institute of Technology
Harvard University Medical School and Children's Hospital (Boston)
http://www.medg.lcs.mit.edu/projects/maita

2. Outline
• Participation
  • Both integration teams on movement analysis
  • Knowledge acquisition from SMEs
  • Discussions on battlespace ontology, problem-solving methods, evaluation methods, OKBC, CP refinements
• Progress
  • Monitoring Architecture
  • Movement Analysis
• Plans
  • Refinement
  • Integration
  • Decision Models

3. System Overview
[Figure: monitor process structure, showing trend templates, alerting models, control & interpretation, template and display models, signal transducers, data sources, data & alerts, and a repository of knowledge bases, databases, and state (KBs).]
• Distributed Monitoring Environment
• Multiple developers and operational users
• Distributed editing tools and libraries
• Multiple knowledge bases and ontologies
• Multiple signal and data sources
• Dynamic interacting monitor processes
• Libraries of problem-solving methods
• Signal transducers and trend detectors
• Control, alerting, and display models
• User-added monitors and knowledge
• Rapid distributed construction
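The trend templates and trend detectors named on this slide can be illustrated with a minimal sketch: fit a slope over a sliding window of a signal and raise an alert when it leaves the band a template phase allows. The class names, least-squares fit, window size, and thresholds below are assumptions made for this example, not the project's actual code (which used Java and Lisp implementations).

```java
// Hypothetical sketch of a trend-template check; names and thresholds are illustrative.
import java.util.ArrayDeque;
import java.util.Deque;

public class TrendDetector {
    /** One phase of a trend template: the slope the signal is expected to follow. */
    record Phase(String name, double minSlope, double maxSlope) {}

    private final Deque<double[]> window = new ArrayDeque<>(); // {time, value} pairs
    private final int windowSize;
    private final Phase phase;

    TrendDetector(Phase phase, int windowSize) {
        this.phase = phase;
        this.windowSize = windowSize;
    }

    /** Add a sample; return an alert string when the observed slope leaves the template bounds. */
    String observe(double t, double value) {
        window.addLast(new double[] { t, value });
        if (window.size() > windowSize) window.removeFirst();
        if (window.size() < windowSize) return null;          // not enough data yet
        double slope = leastSquaresSlope();
        if (slope < phase.minSlope() || slope > phase.maxSlope())
            return "ALERT: slope " + slope + " outside phase " + phase.name();
        return null;
    }

    private double leastSquaresSlope() {
        double n = window.size(), st = 0, sv = 0, stt = 0, stv = 0;
        for (double[] p : window) { st += p[0]; sv += p[1]; stt += p[0] * p[0]; stv += p[0] * p[1]; }
        return (n * stv - st * sv) / (n * stt - st * st);
    }

    public static void main(String[] args) {
        TrendDetector d = new TrendDetector(new Phase("steady", -0.5, 0.5), 5);
        double[] values = { 10, 10.1, 9.9, 10.2, 10.0, 11.5, 13.0, 14.8 }; // drifts upward
        for (int i = 0; i < values.length; i++) {
            String alert = d.observe(i, values[i]);
            if (alert != null) System.out.println("t=" + i + " " + alert);
        }
    }
}
```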

4. Distributed Monitoring Architecture
[Architecture diagram: control panels and monitors of monitors (MoMs), implemented as Java applets, oversee a network of monitor processes wired together by terminals. Displays (strip charts, maps, Java applets) and control terminals connect over HTTP and sockets. Each monitor process has input and output terminals, internal state, and subprocesses with terminal connections; it exchanges packets (structured reports) and draws on DBs, KBs, MoM and monitor tables, correlators, transducers, a monitor library, a packet library, alerting models, and monitor knowledge via OKBC and ODBC. Editors maintain the network structure and monitor descriptions.]
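A small sketch of the packet-and-terminal idea behind this architecture: monitor processes consume structured-report packets on input terminals, keep internal state, and emit derived packets on output terminals that can be wired to other monitors or to displays. All class names and the in-process wiring below are illustrative assumptions; the real system moved packets between processes over HTTP and sockets.

```java
// Minimal sketch of monitor processes wired together by terminals; not the MAITA API.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

public class MonitorSketch {
    /** A packet is a structured report: a type tag plus attribute/value fields. */
    record Packet(String type, Map<String, Object> fields) {}

    /** An output terminal fans packets out to whatever input terminals are wired to it. */
    static class OutputTerminal {
        private final List<Consumer<Packet>> wires = new ArrayList<>();
        void wireTo(Consumer<Packet> inputTerminal) { wires.add(inputTerminal); }
        void emit(Packet p) { wires.forEach(w -> w.accept(p)); }
    }

    /** A monitor process: consumes packets on its input terminal, keeps internal state,
        and emits derived packets on its output terminal. */
    static class MonitorProcess implements Consumer<Packet> {
        final String name;
        final OutputTerminal out = new OutputTerminal();
        private int seen = 0;                       // trivial internal state

        MonitorProcess(String name) { this.name = name; }

        @Override public void accept(Packet p) {
            seen++;
            // Derive a new structured report and pass it downstream.
            out.emit(new Packet("summary", Map.of("source", name, "count", seen, "last", p.type())));
        }
    }

    public static void main(String[] args) {
        MonitorProcess tracker = new MonitorProcess("vehicle-tracker");
        MonitorProcess reporter = new MonitorProcess("report-builder");
        tracker.out.wireTo(reporter);                      // wire processes together by terminals
        reporter.out.wireTo(p -> System.out.println(p));   // a display stands in for the control panel

        tracker.accept(new Packet("mti-track", Map.of("id", 42)));
        tracker.accept(new Packet("mti-track", Map.of("id", 43)));
    }
}
```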

5. Movement Analysis
• Geometric algorithms for road-finding, etc.
  • Java and Lisp implementations used by several groups
• Suite of elementary MTI signal transducers
  • Vehicle histories and classifications
  • Starts, stops, enter road, leave road, military, nonmilitary
  • Convoy identification, starts, stops, tracking
  • Classification for certain displacement convoys
  • Off-road site detection and population monitoring
  • Radar site identification
• Strategy
  • Online operation and report generation to the intel officer
  • Report partial information first, refine later
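One of the elementary MTI transducers listed above, start/stop detection, might look like the following sketch: threshold the speed implied by successive position fixes in a vehicle history and emit an event at each transition. The 5 km/h threshold and the record layout are assumptions for the example, not values from the project.

```java
// Illustrative sketch of a start/stop transducer over a vehicle's position history.
import java.util.ArrayList;
import java.util.List;

public class StopStartTransducer {
    record Fix(double t, double x, double y) {}              // one MTI detection (time in h, position in km)
    record Event(String kind, double t) {}                   // derived "start" / "stop" event

    static final double STOP_SPEED_KMH = 5.0;                // assumed threshold; below it the vehicle is stopped

    static List<Event> transduce(List<Fix> history) {
        List<Event> events = new ArrayList<>();
        boolean moving = false;
        for (int i = 1; i < history.size(); i++) {
            Fix a = history.get(i - 1), b = history.get(i);
            double dt = b.t() - a.t();
            double dist = Math.hypot(b.x() - a.x(), b.y() - a.y());
            double speed = dt > 0 ? dist / dt : 0;
            if (!moving && speed >= STOP_SPEED_KMH) { moving = true;  events.add(new Event("start", b.t())); }
            if (moving  && speed <  STOP_SPEED_KMH) { moving = false; events.add(new Event("stop",  b.t())); }
        }
        return events;
    }

    public static void main(String[] args) {
        List<Fix> history = List.of(
            new Fix(0.0, 0.0, 0.0), new Fix(0.1, 0.0, 0.0),  // parked
            new Fix(0.2, 3.0, 0.0), new Fix(0.3, 6.0, 0.0),  // moving at roughly 30 km/h
            new Fix(0.4, 6.1, 0.0), new Fix(0.5, 6.1, 0.0)); // stopped again
        transduce(history).forEach(e -> System.out.println(e.kind() + " at t=" + e.t()));
    }
}
```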

6. Movement Analysis Processes
[Dataflow diagram connecting the movement-analysis monitor processes: MTI tracks, vehicle motions, military vehicles, convoy ID, convoy motions, traffic trends, displacement convoy ID, displaced sites, off-road sites, radar sites, the convoy motion display, and ISE reports.]
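A toy version of the Convoy ID stage in this dataflow: group vehicles that lie within a small gap of one another along a road into candidate convoys. The 500 m gap and the single-snapshot grouping rule are assumptions for illustration; the actual convoy identification also tracks formation, starts, stops, and motion over time.

```java
// Toy sketch of grouping concurrent vehicle motions into candidate convoys.
import java.util.ArrayList;
import java.util.List;

public class ConvoyId {
    record VehicleMotion(int vehicleId, double alongRoadKm) {}   // position at one snapshot
    static final double MAX_GAP_KM = 0.5;                        // assumed maximum spacing inside a convoy

    /** Partition vehicles (sorted by position along the road) into candidate convoys. */
    static List<List<Integer>> groupConvoys(List<VehicleMotion> snapshot) {
        List<VehicleMotion> sorted = new ArrayList<>(snapshot);
        sorted.sort((a, b) -> Double.compare(a.alongRoadKm(), b.alongRoadKm()));
        List<List<Integer>> convoys = new ArrayList<>();
        List<Integer> current = new ArrayList<>();
        for (int i = 0; i < sorted.size(); i++) {
            if (!current.isEmpty()
                    && sorted.get(i).alongRoadKm() - sorted.get(i - 1).alongRoadKm() > MAX_GAP_KM) {
                convoys.add(current);                            // gap too large: close the current convoy
                current = new ArrayList<>();
            }
            current.add(sorted.get(i).vehicleId());
        }
        if (!current.isEmpty()) convoys.add(current);
        return convoys;
    }

    public static void main(String[] args) {
        List<VehicleMotion> snapshot = List.of(
            new VehicleMotion(1, 10.0), new VehicleMotion(2, 10.3), new VehicleMotion(3, 10.7),
            new VehicleMotion(4, 25.0), new VehicleMotion(5, 25.4));
        System.out.println(groupConvoys(snapshot));              // prints [[1, 2, 3], [4, 5]]
    }
}
```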

7. Successes and Failures
Successes
• Analysis at 10-15x real time
• Good rejection of civilian traffic
• Good identification of convoys
• Good identification of sites
• Reasonable differentiation of military vehicles
• Plausible identification of certain displacement motions
Failures
• Possible errors in understanding SME parameters
• Undiagnosed monitoring failures
Disappointments
• The task required extensive low-level 2½D "signal-processing" mechanisms involving little explicit knowledge and little overlap with familiar 1D signals

8. Lessons Learned
• Efficacy of key facts
  • Off-road movements
  • Coordinated movement histories
• Importance of evaluation semantics
• Common terms in KB language and organization
• Reporting at increasing levels of detail
• Importance of integration
  • Integrate early and often
  • Consensus architecture
  • Simulation and evaluation tools from day 1
  • System administration burden and feasibility of integrator administration
• Combining inputs as an HPKB component task

9. Plans
• Refine architecture
• Integrate with KB editing environments
• Investigate use in multiple domains
  • HPKB domains (battlespace, information assurance)
  • Medical (ICU, diabetes, heart disease)
• Formalize decision-making knowledge for HPKB alerting tasks

10. Decision Models for Alerting
• Who, when, and how to notify
  • Aggregating or serializing related alerts
  • Focusing on the most important alerts
• Preference representation and reasoning
  • Model qualitative and quantitative preferences and tradeoffs
  • Compile qualitative and quantitative information into quantitative utility measures
• KB taxonomies for organizing alerting models and their associations with monitoring methods
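A hedged sketch of compiling preference information into a quantitative utility used to rank alerts: qualitative severity levels map to numbers, and an additive weighted score combines severity, confidence, and recency. The attributes, weights, and additive form are assumptions made for illustration; the slide states the goal rather than a formula.

```java
// Illustrative sketch of ranking alerts by a compiled utility score.
import java.util.Comparator;
import java.util.List;
import java.util.Map;

public class AlertRanking {
    record Alert(String message, String severity, double confidence, double ageMinutes) {}

    // Qualitative severity levels compiled to numbers (an assumed mapping).
    static final Map<String, Double> SEVERITY_SCORE =
        Map.of("routine", 0.2, "important", 0.6, "critical", 1.0);

    /** Additive utility: weighted sum of severity, confidence, and recency. */
    static double utility(Alert a) {
        double severity = SEVERITY_SCORE.getOrDefault(a.severity(), 0.0);
        double recency = Math.max(0.0, 1.0 - a.ageMinutes() / 60.0);   // decays over an hour
        return 0.5 * severity + 0.3 * a.confidence() + 0.2 * recency;
    }

    public static void main(String[] args) {
        List<Alert> alerts = List.of(
            new Alert("convoy formed near bridge", "critical",  0.7, 5),
            new Alert("single vehicle left road",  "routine",   0.9, 2),
            new Alert("possible radar site",       "important", 0.5, 40));
        alerts.stream()
              .sorted(Comparator.comparingDouble(AlertRanking::utility).reversed())
              .forEach(a -> System.out.printf("%.2f  %s%n", utility(a), a.message()));
    }
}
```

The same score could also drive the who/when/how decisions on the slide, for instance by routing anything above a cutoff to immediate notification and batching the rest; that policy is likewise an assumption, not something the slide specifies.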
