
SENSOR FUSION LABORATORY


Presentation Transcript


  1. SENSOR FUSION LABORATORY
Thad Roppel, Associate Professor, AU Electrical and Computer Engineering Dept., troppel@eng.auburn.edu
MISSION: Study the benefits of using simultaneous information from multiple sensors to probe the environment.
EXAMPLES
• Infrared / millimeter-wave radar for vehicle detection and identification
• Chemical sensor arrays – "artificial nose"
• Biomimetics – imitating animal sensorimotor behaviors
• Biomedical – using electrical and optical probes to study cardiac arrhythmias

  2. Problem Complexity: Human vs. Machine
EASY for humans, HARD for machines (maximum potential benefit of sensor fusion):
• Object recognition
• Linguistics
• Extraction of relevant features from sensor arrays
• Judging
EASY for machines, HARD for humans:
• Thresholding
• Tallying
• Arithmetic
• Logic

  3. Personnel and Publications
PERSONNEL
• Ting-To Lo (PhD): Molecular Switching in Biosensors
• Rama Narendran (PhD): Biomimetic Simulations of Organized Machine Behavior
• Jun Pan (PhD): Wireless Protocol for Electrical and Optical Cardiac Microprobes
• Aroldo Couto (MS): Flight Stabilization Using Adaptive Artificial Neural Networks
• Brian Wingfield (MS): Silicon Processing for Lateral Emission Fiber-Optic Sensors
REPRESENTATIVE RECENT PUBLICATIONS
• D. M. Wilson, T. Roppel, and R. Kalim, "Aggregation of Sensory Input for Robust Performance in Chemical Sensing Microsystems," Sensors and Actuators B, 64(1–3), 107–117, June 2000.
• T. Roppel and D. M. Wilson, "Biologically-Inspired Pattern Recognition for Odor Detection," Pattern Recognition Letters, 21(3), 213–219, March 2000.
• D. M. Wilson, K. Dunman, T. Roppel, and R. Kalim, "Rank Extraction in Tin-Oxide Sensor Arrays," Sensors and Actuators B, 62(3), 199–210, April 2000.
• T. Roppel, R. Kalim, and D. Wilson, "Sensory Plane Analog-VLSI for Interfacing Sensor Arrays to Neural Networks," Virtual Intelligence and Dynamic Neural Networks VI-DYNN '98, Stockholm, Sweden, June 22–26, 1998.

  4. IR / MMW DATA FUSION
Support: AFOSR, 1992–93
PROJECT GOAL: Improved identification of military vehicles from aerial scenes.
Target classes: T-62 tank, M-113 Armored Personnel Carrier (APC), LANCE missile launcher

  5. IR / MMW Fusion, cont'd
APPROACH: [Block diagram: IR scene pixels and MMW radar data are fed to a neural network, which classifies each object as TANK, APC, or LAUNCHER.]
PERFORMANCE ASSESSMENT:
• Multiple permutations
• Confusion matrix
• Average result
OVERALL RESULT: 14% improvement with sensor fusion
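The performance-assessment steps above (run multiple permutations, tally a confusion matrix, average the result) can be sketched in a few lines. This is only an illustrative Python sketch, not the original study's code; the toy label vectors and the per-class-recall definition of the average result are assumptions for the example.

```python
import numpy as np

CLASSES = ["TANK", "APC", "LAUNCHER"]

def confusion_matrix(true_labels, pred_labels):
    """Tally a confusion matrix: rows = true class, columns = predicted class."""
    idx = {c: i for i, c in enumerate(CLASSES)}
    cm = np.zeros((len(CLASSES), len(CLASSES)), dtype=int)
    for t, p in zip(true_labels, pred_labels):
        cm[idx[t], idx[p]] += 1
    return cm

def average_accuracy(cm):
    """Average result as mean per-class recall (diagonal / row sums)."""
    return float(np.mean(np.diag(cm) / cm.sum(axis=1)))

# Hypothetical classifier outputs from one train/test permutation
true = ["TANK"] * 10 + ["APC"] * 10 + ["LAUNCHER"] * 10
pred = (["TANK"] * 9 + ["APC"]) + (["APC"] * 8 + ["TANK"] * 2) + ["LAUNCHER"] * 10
cm = confusion_matrix(true, pred)
print(cm)
print(average_accuracy(cm))
```

In the full assessment, the same tally would be repeated over many train/test permutations and the per-permutation accuracies averaged, which is how a single headline number like the 14% improvement can be quoted.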

  6. Chemical Sensor Arrays
Support: DARPA, 1997–99
PROJECT GOAL: Improved identification and detection of chemical plumes in non-laboratory conditions.
[Diagram: a command station connected by RF link to a vehicle carrying wind sensors, approaching a chemical plume along a road.]

  7. Canine Training at IBDS
Auburn is world-renowned for training of detection dogs at the Institute for Biological Detection Systems.

  8. Chemical Sensor Arrays, cont'd — Sensor Array Dynamic Response
[Figure: odor sensor array outputs; sensor voltage (0–5 V) vs. timestep (0–500).]

  9. Chemical Sensor Arrays, cont'd — Preprocessing
[Figure, left: raw output, sensor voltage (0–5 V) vs. timestep (0–500) for sensors 1–15. Right: thresholded binary output (above/below threshold), sensor number vs. timestep (0–50).]

  10. Chemical Sensor Arrays, cont'd
[Figure: array response patterns for three samples of each odor (ace, amm, dal, g87, g89, g93, oil, pth, xyl) across sensors 1–15.]

  11. Chemical Sensor Arrays, cont'd — Time Evolution of Confusion Matrix: Forward Sequence
[Figure: confusion matrices (network response vs. input categories) for the nine odors (ace, amm, dal, g87, g89, g93, oil, pth, xyl); network trained for 20 timesteps. Panels show the response after 1, 5, 10, 20, and 50 timesteps, plus the ideal response; color scale 0–1.]

  12. Chemical Sensor Arrays, cont'd — Time Evolution of Confusion Matrix: Random Sequence
[Figure: as in the previous slide, but with odors presented in random order; confusion matrices after 1, 5, 10, 20, and 50 timesteps, plus the ideal response; color scale 0–1.]

  13. Chemical Sensor Arrays – Summary
• A recurrent neural network was trained to recognize 9 odors presented in an arbitrary time sequence.
• Threshold preprocessing reduces response time by an order of magnitude.
• The approach is well suited as a front end for a hierarchical suite of NNs in a portable, near-real-time odor classification device.
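The summary above describes a recurrent network that accumulates thresholded sensor frames over time before naming an odor. The sketch below shows only the structure of such a classifier (an Elman-style recurrent forward pass); the weights are random and untrained, and the hidden-layer size is an assumption, so the predicted label is arbitrary. Only the nine odor labels and the 15-sensor binary input come from the slides.

```python
import numpy as np

ODORS = ["ace", "amm", "dal", "g87", "g89", "g93", "oil", "pth", "xyl"]
N_SENSORS, N_HIDDEN = 15, 20  # hidden size is an assumed value

rng = np.random.default_rng(0)
# Illustrative random weights; the network on the slides was trained.
W_in = rng.normal(0.0, 0.1, (N_HIDDEN, N_SENSORS))
W_rec = rng.normal(0.0, 0.1, (N_HIDDEN, N_HIDDEN))
W_out = rng.normal(0.0, 0.1, (len(ODORS), N_HIDDEN))

def classify_sequence(binary_frames):
    """Run an Elman-style recurrent forward pass over a sequence of
    thresholded 15-sensor frames and return the winning odor label."""
    h = np.zeros(N_HIDDEN)
    for x in binary_frames:
        h = np.tanh(W_in @ x + W_rec @ h)  # state carries sequence history
    return ODORS[int(np.argmax(W_out @ h))]

frames = rng.integers(0, 2, (50, N_SENSORS))  # 50 timesteps of binary input
print(classify_sequence(frames))
```

Because the hidden state carries history across timesteps, the same network can be probed after 1, 5, 10, 20, or 50 timesteps, which is exactly what the time-evolving confusion matrices on the preceding slides visualize.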

  14. BIOMIMETICS
Support: Under discussion with AF Advanced Guidance Division, Munitions Directorate, Eglin AFB
PROJECT GOAL: Learn sensor fusion from animals and apply it to flying a drone to a target using onboard video.
• Flies land accurately
• Bats catch evading insects in flight
• Bees find flowers

  15. BIOMIMETICS, cont'd
What do they "know" that we don't? One possibility is that they use variations of optic flow: represent the sensory image field by a motion vector field.
[Figure: an image sequence and the optic flow field computed from it.]
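The idea of turning an image sequence into a motion vector field can be illustrated with the simplest gradient-based formulation: brightness constancy gives Ix·u + Iy·v + It ≈ 0 at each pixel, and solving that in a least-squares sense yields a flow estimate. The sketch below is a toy, whole-frame (single-vector) version, assuming synthetic input; it is not the laboratory's method, and real optic flow systems solve this per-pixel or per-patch.

```python
import numpy as np

def global_flow(frame1, frame2):
    """Estimate one (u, v) motion vector between two grayscale frames
    by least-squares on the brightness-constancy constraint
    Ix*u + Iy*v = -It (a whole-frame version of gradient-based flow)."""
    Ix = np.gradient(frame1, axis=1)   # spatial gradient, x direction
    Iy = np.gradient(frame1, axis=0)   # spatial gradient, y direction
    It = frame2 - frame1               # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Toy test: shift a smooth pattern one pixel to the right
x = np.arange(32)
f1 = np.tile(np.sin(2 * np.pi * x / 32), (32, 1))
f2 = np.roll(f1, 1, axis=1)
u, v = global_flow(f1, f2)
print(u, v)  # expect roughly one pixel of rightward motion
```

An animal-inspired controller would not need the vectors themselves, only simple statistics of the field (e.g. the overall expansion rate during a landing approach), which is what makes the strategy attractive for lightweight onboard hardware.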

  16. BIOMIMETICS, cont'd
EXAMPLES
• A fly can land simply by maintaining constant optic flow.
• A dog can track by maintaining constant sensory flow across the olfactory epithelium and following the gradient (using sniffing as a form of "chopper amplifier").
Questions to be answered: Can we guide a missile to target, orchestrate complex defense systems, identify faces in a crowd, or track contaminated food with similar approaches?
END OF PRESENTATION
