WSN Summer Project Demo Scenario


Presentation Transcript


  1. WSN Summer Project Demo Scenario <Draft version 2008-07-14> Skövde 2008

  2. Shows features (legend)
  • Real-time (RT)
  • Fusion (Fus)
  • Database, Replication (DB)
  • Scalability, ViFuR (VF)

  3. Fire-fighting scenario or battlefield game?
  • The scenario described can be mapped to alternative real scenarios – "imagine!"

  4. Scenario features
  • Items (people/vehicles etc.) are moving over a field. They reveal signatures to (light) sensors that are spread out over the field. Their locations are reported into a database, from where their tracks can be extracted and also predicted by using fusion (e.g. Kalman filters).
  • Items are reported, with signatures and timestamps, to the connected database node.
  • Disjoint sensor sub-nets report their area to separate database nodes. Tracking and event localization is done at one (or several) fusion node(s), where database replication provides data for the entire area.
  • Tracks and actions are visualized at multiple client nodes, which have replicas of the visualization data (segment) of the database.
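The reporting flow above – items reported with signatures and timestamps, each disjoint sub-net feeding its own database node – can be sketched in Python. All class and field names here are illustrative, not taken from the project code:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SensorReport:
    """One detection from a sensor node: signature plus timestamp."""
    node_id: int       # reporting sensor node
    subnet: int        # disjoint sub-net the node belongs to
    signature: str     # classified signature (e.g. "person", "vehicle")
    timestamp: float   # time of the reading
    x: float           # field coordinates of the sensor
    y: float

@dataclass
class DatabaseNode:
    """Database node serving exactly one sensor sub-net."""
    subnet: int
    reports: List[SensorReport] = field(default_factory=list)

    def ingest(self, report: SensorReport) -> None:
        # Each report goes to the database node of its own sub-net.
        assert report.subnet == self.subnet
        self.reports.append(report)

# One database node per disjoint sub-net (two sub-nets assumed here)
db_nodes: Dict[int, DatabaseNode] = {s: DatabaseNode(s) for s in (0, 1)}
r = SensorReport(node_id=7, subnet=1, signature="vehicle",
                 timestamp=120.0, x=3.0, y=4.5)
db_nodes[r.subnet].ingest(r)
```

A fusion node would then read tracks out of these per-sub-net stores, with replication (slide 10) providing data for the entire area.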

  5. Features of the Scenario
  • Sensors: tracking by using features, event detection and localization
  • Events are localized using certainty management, combining multiple sensors (RT, Fus)
  • Sensors classify readings into signatures; this emulates processing for signatures at the sensor (RT, Fus)
  • Tracks + localized events are visualized on multiple displays (DB, Rep)
  • Tracking by using the logged visual trace (a short-term database time series); items may be distinguished by their physical movement limitations (DB, Fus)
  • In the multi-tier architecture, sensors are arranged into sub-nets, each connected to a (single) database node. Data from a database node, representing sensors in the sub-net, are replicated to other database nodes as client needs arise. (VF, DB)

  6. Sensor setup
  • Visual: multiple TelosB, using the light sensors.
  • Optional: (Acoustic: room has 5 MICAz+MTS310, using the microphone sound level)

  7. Sensor placement
  [figure: grid of light sensors on a projector screen; L = light sensor]

  8. Technical needs
  • Move-events + classification from sensors
  • Action-events + signature from sensors
  • Feed of sensor data + (re-)configuration. Support requests for:
    1) single sensor input
    2) periodic sensor input
    3) sensor input (single or periodic) in response to an event (including timer).
    All sensor input includes a vector of all sensor readings in the sensor node. (Should we prepare for the possibility of requesting a subset, or adding additional sensors? Should we prepare for the need to buffer multiple readings and send them as a single message to save bandwidth?)
  • Database replication, single master, but extendable for PRiDe
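The three request types and the "reply carries all readings in the node" rule can be sketched as a small message schema. The mode names, fields and reading values are illustrative assumptions, not the project's actual protocol:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, List, Optional

class RequestMode(Enum):
    SINGLE = auto()     # 1) single sensor input
    PERIODIC = auto()   # 2) periodic sensor input
    ON_EVENT = auto()   # 3) input in response to an event (including timer)

@dataclass
class SensorRequest:
    mode: RequestMode
    period_ms: Optional[int] = None   # only meaningful for PERIODIC
    event_id: Optional[int] = None    # only meaningful for ON_EVENT

    def __post_init__(self) -> None:
        if self.mode is RequestMode.PERIODIC and self.period_ms is None:
            raise ValueError("PERIODIC requests need a period")
        if self.mode is RequestMode.ON_EVENT and self.event_id is None:
            raise ValueError("ON_EVENT requests need an event id")

@dataclass
class SensorReply:
    """Every reply carries the full vector of readings in the node."""
    node_id: int
    readings: List[int]

def serve(request: SensorRequest, node_id: int,
          read_all: Callable[[], List[int]]) -> SensorReply:
    # Regardless of mode, the reply includes ALL sensor readings in the node.
    return SensorReply(node_id=node_id, readings=read_all())

reply = serve(SensorRequest(RequestMode.SINGLE), node_id=3,
              read_all=lambda: [412, 87, 19])   # example reading vector
```

Requesting a subset of sensors, or batching several readings into one message to save bandwidth, would extend `SensorRequest` with a sensor mask or a batch count – the open questions on the slide.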

  9. Impl: Mote communication
  • Tmote Connect approach
  • [ TelosB gateway → Tablet USB ]
  • [ Tinymote gateway → Tablet SD slot ]
  • [ Bluetooth gateway → Tablet (port?) ]

  10. Impl: Database
  • Store sensor data in local BDB database
  • Visualize sensor data
  • Replicate sensor data to other nodes
  • Visualize sensor data at other DB nodes
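A minimal sketch of the single-master replication named on slide 8: a Python dict stands in for the local BDB store, and writes on the master are pushed to each registered replica (such as a visualization client). This is an assumption-laden toy, not BDB's actual replication API, and it does not implement the multi-master reconciliation that PRiDe would add:

```python
from typing import Dict, List

class ReplicatedStore:
    """Single-master replication sketch: a dict stands in for the
    local BDB database; writes on the master are pushed to replicas."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.data: Dict[str, str] = {}
        self.replicas: List["ReplicatedStore"] = []

    def add_replica(self, node: "ReplicatedStore") -> None:
        self.replicas.append(node)
        node.data.update(self.data)        # bring the new replica up to date

    def put(self, key: str, value: str) -> None:
        """Master-only write: store locally, then push to every replica."""
        self.data[key] = value
        for node in self.replicas:
            node.data[key] = value

master = ReplicatedStore("db-node-A")      # database node of one sub-net
viewer = ReplicatedStore("client-1")       # visualization client replica
master.add_replica(viewer)
master.put("track:7", "x=3.0 y=4.5 t=120")
```

Replicating only the visualization data segment, as slide 4 describes, would amount to filtering which keys `put` pushes to each replica.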

  11. Visualization (1)
  • Raw indications
  [figure: sensor grid; symbol = intensity of light, color = certainty of indication]

  12. Visualization (2)
  • Trace (and prediction)
  [figure: a track annotated with per-point "certainty %" labels (30%, 50%, 70%, 80%, 100%); one symbol = fused localization (intensity of light / certainty of indication), another = prediction (by Kalman filter?)]
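The prediction step suggested on this slide (and on slide 4) could be a Kalman filter over the fused localizations. Below is a minimal 1-D constant-velocity sketch: the filter's position variance is what a display could map to a "certainty %" label. The noise parameters `q` and `r` are illustrative values, not tuned for the actual sensors:

```python
def kalman_1d(measurements, dt=1.0, q=0.01, r=0.5):
    """1-D constant-velocity Kalman filter (sketch).
    Returns a list of (position estimate, position variance) per step;
    lower variance means higher certainty of the track point."""
    x, v = measurements[0], 0.0          # state: position, velocity
    p_xx, p_xv, p_vv = 1.0, 0.0, 1.0     # covariance entries
    track = []
    for z in measurements[1:]:
        # --- predict with the constant-velocity motion model ---
        x += v * dt
        p_xx += dt * (2.0 * p_xv + dt * p_vv) + q
        p_xv += dt * p_vv
        p_vv += q
        # --- update with the fused sensor localization z ---
        s = p_xx + r                     # innovation variance
        k_x, k_v = p_xx / s, p_xv / s    # Kalman gains
        y = z - x                        # innovation
        x += k_x * y
        v += k_v * y
        p_vv -= k_v * p_xv               # covariance shrinks on update
        p_xx *= (1.0 - k_x)
        p_xv *= (1.0 - k_x)
        track.append((x, p_xx))
    return track

# An item moving at constant speed across the field (1 unit per step)
estimates = kalman_1d([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
```

Predicting ahead of the last measurement is just the predict step applied without an update, which is where the dashed "prediction" points on the slide would come from; the field scenario would use a 2-D state (x, y and their velocities) instead.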
