
Neural, Bayesian, and Evolutionary Systems for High-Performance Computational Knowledge Management: Demonstrations
Wednesday, August 4, 1999
William H. Hsu, Loretta Auvil, Tom Redman, Michael Welge
Michael Bach, Peter L. Johnson, Mike Perry, Kristopher Wuollett
Automated Learning Group


Presentation Transcript


  1. Neural, Bayesian, and Evolutionary Systems for High-Performance Computational Knowledge Management: Demonstrations
     Wednesday, August 4, 1999
     William H. Hsu, Loretta Auvil, Tom Redman, Michael Welge
     Michael Bach, Peter L. Johnson, Mike Perry, Kristopher Wuollett
     Automated Learning Group
     National Center for Supercomputing Applications
     http://www.ncsa.uiuc.edu/STI/ALG

  2. Overview: Tools for Dealing with Multisensor T&E Data
     • Short-Term Objectives: Building a Data Model
       • Progress to date: data channel typing for ontology
       • Current work: CGI form for data channel grouping, selection
       • Future work: integrity-checking, data preparation modules
     • Longer-Term Objectives
       • Multimodal Sensor Integration: multiple models in data fusion itinerary
       • Relevance Determination: genetic algorithm wrapper (current work)
       • Causal (Explanatory) Models: Bayesian network based on ontology
     • Test Bed: Super ADOCS Data Format (SDF)
       • 1719-channel asynchronous data bus (General Dynamics)
     • Experiment/Data Design
       • Typing: interactive tool for constructing data model
       • Specification of prediction target based on caution/warning channels
       • Interactive specification tool for learning architectures, algorithms
       • Target end users: test/instrumentation report designers, implementors
     • Analytical Applications: Decision Support
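The "genetic algorithm wrapper" for relevance determination named on this slide can be pictured as a search over bit-masks, where each bit decides whether a data channel is included. The sketch below is only an illustration of the wrapper idea, not the D2K implementation: the function name, the truncation-selection scheme, the toy fitness function, and all parameter values are assumptions.

```python
import random

def ga_channel_selection(channels, fitness, pop_size=20, generations=30,
                         mutation_rate=0.02, seed=0):
    """Evolve a bit-mask over data channels; `fitness` scores a mask
    (in a real wrapper it would be cross-validated model accuracy)."""
    rng = random.Random(seed)
    n = len(channels)
    # initial population of random inclusion masks
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        elite = ranked[: pop_size // 2]          # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)            # one-point crossover
            children.append([bit ^ (rng.random() < mutation_rate)
                             for bit in a[:cut] + b[cut:]])
        pop = elite + children
    best = max(pop, key=fitness)
    return [c for c, bit in zip(channels, best) if bit]

# Toy usage: fitness rewards relevant channels, penalises the rest.
channels = ["fuel_flow", "gps_lat", "hyd_press", "spare_1", "spare_2", "spare_3"]
relevant = {"fuel_flow", "hyd_press"}

def fitness(mask):
    return sum(1 if c in relevant else -1
               for c, bit in zip(channels, mask) if bit)

selected = ga_channel_selection(channels, fitness)
```

Because the best-so-far masks are carried over unchanged (elitism), the best fitness in the population never decreases from generation to generation.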

  3. Super ADOCS Data Format (SDF) Data Conversion and Selection Interface
     • CGI (Perl-based) form: Apache, MS Internet Explorer 5

  4. An Ontology for T&E Data
     • Application Testbed
       • Aberdeen Test Center: M1 Abrams main battle tank (SEP data, SDF)
     • Generic Data Model (Facility for Experiment Specification)
       • T&E Information Systems: Common Characteristics
     • Large-Scale Data Model
       • Objective: develop system capable of reducing model complexity
       • Methodology: build a relational (taxonomic, definitional) model of data
     • Data Integrity Requirements
       • Interactive form-based specification of test objective
       • Specification of error metrics, visualization criteria
     • Multimodality
       • Selection of relevant data channels
       • Interactive, support for automation
     • Data Reduction Requirements
       • Non-uniform downsampling: requires database of engineering units
       • Irrelevant data channels: requires type hierarchy
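The non-uniform downsampling requirement on this slide can be illustrated with a zero-order-hold resampler that maps one asynchronously sampled channel onto a uniform time grid. This is a minimal sketch under stated assumptions: the function name, the (timestamp, value) pair representation, and the hold-last-value policy are illustrative, and the real system would look up per-channel rates and engineering units in its database.

```python
def resample_channel(samples, t0, t1, period):
    """Downsample an asynchronously sampled channel onto a uniform grid
    [t0, t1] with the given period, carrying the last observed value
    forward (zero-order hold). `samples` is a list of (timestamp, value)."""
    samples = sorted(samples)
    out, i, last = [], 0, None   # `last` stays None before the first sample
    t = t0
    while t <= t1:
        # consume every raw sample at or before the current grid point
        while i < len(samples) and samples[i][0] <= t:
            last = samples[i][1]
            i += 1
        out.append((t, last))
        t += period
    return out

# Example: three raw samples resampled onto a 1.0-second grid.
grid = resample_channel([(0.2, 10), (1.5, 20), (3.1, 30)], 0.0, 4.0, 1.0)
```

Grid points that precede the first raw sample carry `None`, which a downstream integrity check could flag rather than silently interpolate.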

  5. SDF Ontology: Data Channel Types
     • Caution/Warning
     • Profilometer
     • Fuel Systems
     • Timing
     • Spatial/GPS/Navigation
     • Hydraulics
     • Data Bus/Control/Diagnostics
     • Ballistics
     • Electrical
     • Unused
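The channel types on this slide suggest how raw SDF channel names could be grouped by type. The sketch below is a keyword-matching stand-in for illustration only: the actual typing tool is interactive, and the keyword table and matching rule here are assumptions.

```python
# Hypothetical keyword -> type table; first matching keyword wins.
CHANNEL_TYPES = {
    "caution": "Caution/Warning", "warn": "Caution/Warning",
    "profil": "Profilometer",
    "fuel": "Fuel Systems",
    "clock": "Timing",
    "gps": "Spatial/GPS/Navigation", "nav": "Spatial/GPS/Navigation",
    "hyd": "Hydraulics",
    "bus": "Data Bus/Control/Diagnostics",
    "ballist": "Ballistics",
    "volt": "Electrical", "amp": "Electrical",
}

def group_channels(names):
    """Group channel names by type; unmatched names fall into 'Unused'."""
    groups = {}
    for name in names:
        typ = next((t for kw, t in CHANNEL_TYPES.items()
                    if kw in name.lower()), "Unused")
        groups.setdefault(typ, []).append(name)
    return groups

groups = group_channels(["FUEL_FLOW_1", "GPS_LAT", "HYD_PRESS", "SPARE_7"])
```

A grouping like this would feed the CGI form's channel-selection lists, one list per type.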

  6. Intranet Operating Environment
     • Database Access
       • SDF import, flat file export
       • Internal data model: interaction with learning modules
       • Future development: SQL/Oracle 8 (JDBC) interface
     • Deployment
       • CGI, JavaScript stand-alone applications
       • Management of modules, data flow through forms
     • Presentation: Web-Based Interface
       • Simple, HTML-based invocation system
       • Common Gateway Interface (CGI) and Perl
       • Alternative implementation: servlets (http://www.javasoft.com)
       • Configuration of data model (file generation)
       • Management of experiments
       • Construction of models
       • Specification of learning systems (model architecture, training algorithm)
     • Messaging Systems (Deployment → Presentation)
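The form-based configuration of experiments described on this slide can be sketched as parsing submitted form fields into a specification record. The example is in Python rather than the Perl/CGI actually used, and every field name (`channel`, `target`, `model`) is hypothetical.

```python
from urllib.parse import parse_qs

def parse_experiment_form(query_string):
    """Turn submitted form fields into an experiment specification dict.
    Repeated `channel` fields model a multi-select list of data channels."""
    fields = parse_qs(query_string)
    return {
        "channels": fields.get("channel", []),         # selected channels
        "target": fields.get("target", [""])[0],       # prediction target
        "model": fields.get("model", ["default"])[0],  # learning architecture
    }

spec = parse_experiment_form(
    "channel=FUEL_1&channel=GPS_LAT&target=CAUTION_3&model=mlp")
```

A specification like this could then be written out as the data-model file that the learning modules consume.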

  7. Super ADOCS Data Format (SDF) Experiment Design Interface
     • D2K Genetic “Wrapper” for Data Channel Selection

  8. Time Series Analysis and Visualization: System Prototype
     • Visible Decisions Inc. (VDI) In3D

  9. Summary and Conclusion
     (Diagram: Environment (Data Model), Learning Element, Knowledge Base, Time Series Analysis/Prediction)
     • Model Identification
       • Queries: test/instrumentation reports
       • Specification of data model
       • Grouping of data channels by type
     • Prediction Objective Identification
       • Specification of test objective
       • Identification of metrics
     • Reduction
       • Refinement of data model
       • Selection of relevant data channels (given prediction objective)
     • Synthesis: New Data Channels
     • Integration: Multiple Time Series Data Sources
