
Sensor Grid: A Workbench for Real-Time Sensor-Data Processing


Presentation Transcript


  1. Sensor Grid: A Workbench for Real-Time Sensor-Data Processing
  Munehiro Fukuda, Ph.D.
  Distributed Systems Laboratory, Computing & Software Systems, University of Washington Bothell

  2. Parallel Computing Opportunities
  • Computation-intensive opportunities using live data
  • Agriculture: frost protection
    Q: How would you predict the air temperature at each point of a farm for the next 4, 8, and 12 hours?
    A: Air-temperature interpolation and prediction
  • Driving-time estimation
    Q: How would you accurately estimate your driving time when caught in a traffic jam?
    A: Real-time traffic simulation
  • Content-based video search
    Q: How would you instantly find the best video to explain “solar system formation”, “Mars terraforming”, or “fetus development” to your kids?
    A: Parallel video-content analysis

  3. Hurdles to Parallelization
  • From users’ perspective: computing power still lies out of reach
    • Grids/clouds are not yet capable of tailoring parallel computation to real-time data provided by individual users
    • Connecting computation (multi-core PCs) to peripheral I/O (sensors and cells) still requires manual set-up at the individual level
  • From developers’ perspective: a strong allergy to parallel computing
    • Steep learning curve for the concepts and usage of multiple processes and threads
    • Difficulty in enforcing proper synchronization

  4. Motivation
  • Our target in parallel computing
    • Data: real-time sensor data, Internet GIS data, multimedia data, etc.
    • Computation: on-the-fly assistance in decision making
  • Autonomic computing-resource allocation
    • Locating the best computing resources in a decentralized and dynamic fashion
    • Automating the (re-)connection of all (migrating) computing processes
  • Parallelization tool
    • Developing a process-, thread-, and synchronization-unaware library for parallel simulation

  5. Architectural Model for On-the-Fly Parallel Computation with Live Data
  [Diagram: multi-agent spatial simulation with storage; User 1 runs a wave model on Cluster A fed by water-level sensors, User 2 runs a heat model on Cluster B fed by temperature sensors, with field-based resource allocation matching computation to clusters]

  6. Research Contribution
  • Field-based process dispatch and migration
    • Bottom-up self-organizing resource advertisement
    • Top-down self-adapting resource acquirement
    • Nomadic-process tracking and broken-link recovery
  • Multi-process, multithreaded library for multi-agent spatial simulation
    • No awareness of processes, threads, and race conditions
    • Dynamic load balancing
    • Dynamic computation scale-up and scale-down
  • Delivery of sensor data
    • Stream-oriented, file-based delivery to nomadic computation
    • Time-based, event-driven, and/or publisher/subscriber-based data delivery

  7. Field-Based Process Dispatch & Migration - Self-Adapting and Self-Organizing Approaches
  [Diagram: applications perform top-down self-adapting resource allocation on top of a bottom-up self-organizing computation-resource potential field spanning Shizuoka University and UW Bothell over the Internet]

  8. Field-Based Process Dispatch & Migration - Proposed Implementation
  • Layer 5: Applications (MPI, MASS) running as ranks 0-3
  • Layer 4: Sentinel agents (sentinel-agent migration)
  • Layer 3: Computing-resource potential field (potential-field agents)
  • Layer 2: UWAgents daemons
  • Layer 1: TCP-link-assisted UDP-broadcast space (vlinks, tcplink/udp_relay nodes and bricks)
  • Layer 0: Hardware (computing nodes over network segments A, B, and C)
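  To give a concrete feel for how a Layer-4 sentinel agent might exploit the Layer-3 computing-resource potential field, the sketch below performs greedy hill climbing over advertised potentials, migrating toward the neighbor with the highest value until it reaches a local maximum. This is only an illustrative sketch: the node names, potential values, neighbor topology, and even the hill-climbing policy itself are assumptions, not the actual protocol on the slides.

  import java.util.*;

  // Illustrative sketch only: a sentinel agent hops along a computation-resource
  // potential field, always migrating to the neighbor that advertises the highest
  // potential, until it reaches a local maximum. Node names, potentials, and the
  // neighbor topology below are invented.
  public class PotentialFieldDispatch {

      static Map<String, Double> potential = Map.of(
          "segA-node1", 0.2, "segA-node2", 0.5,
          "segB-node1", 0.7, "segC-node1", 0.9);

      static Map<String, List<String>> neighbors = Map.of(
          "segA-node1", List.of("segA-node2"),
          "segA-node2", List.of("segA-node1", "segB-node1"),
          "segB-node1", List.of("segA-node2", "segC-node1"),
          "segC-node1", List.of("segB-node1"));

      // Greedy hill climbing over the advertised potential field.
      static String climb(String current) {
          while (true) {
              String best = current;
              for (String n : neighbors.getOrDefault(current, List.of()))
                  if (potential.get(n) > potential.get(best)) best = n;
              if (best.equals(current)) return current;   // local maximum reached
              System.out.println("migrating " + current + " -> " + best);
              current = best;
          }
      }

      public static void main(String[] args) {
          System.out.println("dispatched to: " + climb("segA-node1"));
      }
  }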

  9. MASS: Library for Multi-Agent Spatial Simulation - Execution Model

  10. MASS: Library for Multi-Agent Spatial Simulation - Programming Style

  import MASS.*;           // Library for Multi-Agent Spatial Simulation
  import java.util.Vector; // for Vector

  // Simulation scenario
  public class RandomWalk {
    public static void main( String[] args ) {
      // validate the arguments
      int size    = Integer.parseInt( args[0] );
      int nCars   = Integer.parseInt( args[1] );
      int maxTime = Integer.parseInt( args[2] );

      // start MASS
      MASS.init( args );

      // create a streets array
      Places streets = new Places( 1, "MeshedStreets", null, size, size );
      // populate Car agents on the streets
      Agents cars = new Agents( 2, "Car", null, streets, nCars );

      // define the four neighbors of each cell
      Vector<int[]> neighbors = new Vector<int[]>( );
      int[] north = { 0, -1 }; neighbors.add( north );
      int[] east  = { 1, 0 };  neighbors.add( east );
      int[] south = { 0, 1 };  neighbors.add( south );
      int[] west  = { -1, 0 }; neighbors.add( west );

      // now go into a cyclic simulation
      for ( int time = 0; time < maxTime; time++ ) {
        // exchange #cars with the four neighbors
        streets.exchangeAll( 1, MeshedStreets.exchange, neighbors );
        streets.callAll( MeshedStreets.update );

        // move cars to a neighbor if space is available
        cars.callAll( Car.decideNewPosition );
        cars.manageAll( );
      }

      // finish MASS
      MASS.finish( );
    }
  }
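  The MeshedStreets and Car classes that the scenario above relies on are not shown on the slide. To make the flow of one simulation cycle concrete without guessing at the MASS class definitions, here is a small, self-contained plain-Java sketch of what the exchange/update/move cycle does conceptually on a meshed grid; every class, constant, and method name in it is illustrative and independent of the MASS library.

  import java.util.Random;

  // Self-contained sketch (independent of MASS) of one RandomWalk-style cycle:
  // each street cell tracks its occupancy, and each car tries to hop to a free
  // neighboring cell. Grid size, car count, and iteration count are made up.
  public class RandomWalkSketch {
      static final int SIZE = 4, N_CARS = 5, MAX_TIME = 10;
      static final int[][] NEIGHBORS = { { 0, -1 }, { 1, 0 }, { 0, 1 }, { -1, 0 } };

      public static void main(String[] args) {
          boolean[][] occupied = new boolean[SIZE][SIZE];
          int[][] cars = new int[N_CARS][2];
          Random rand = new Random(0);

          // scatter cars over distinct cells
          for (int i = 0; i < N_CARS; i++) {
              int x, y;
              do { x = rand.nextInt(SIZE); y = rand.nextInt(SIZE); } while (occupied[x][y]);
              occupied[x][y] = true;
              cars[i] = new int[] { x, y };
          }

          // cyclic simulation: the conceptual equivalent of exchangeAll/callAll/manageAll
          for (int time = 0; time < MAX_TIME; time++) {
              for (int[] car : cars) {
                  int[] d = NEIGHBORS[rand.nextInt(NEIGHBORS.length)];
                  int nx = car[0] + d[0], ny = car[1] + d[1];
                  // move only if the target cell exists and is free
                  if (nx >= 0 && nx < SIZE && ny >= 0 && ny < SIZE && !occupied[nx][ny]) {
                      occupied[car[0]][car[1]] = false;
                      occupied[nx][ny] = true;
                      car[0] = nx; car[1] = ny;
                  }
              }
          }
          for (int i = 0; i < N_CARS; i++)
              System.out.println("car " + i + " ended at (" + cars[i][0] + "," + cars[i][1] + ")");
      }
  }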

  11. Retrieving Data from Sensor Network
  • Three strategies to retrieve sensor data:
    • Publisher/subscriber broker (push type)
      • Implementing a broker that calls back to CORBA/RMI clients upon the occurrence of registered sensor events
      • Examples: VikingX and Aginova WiFi sensors, both with a UDP-messaging feature
    • Web server (pull type) (see the sketch below)
      • Using a built-in web server on a sink node, or installing a vendor-provided web portal or server
      • Examples: Aginova web portal and ConceptOne’s point manager
    • Database queries (pull type)
      • Installing TinyOS on each sensor and TinyDB on a sink node
      • Sending queries to retrieve sensor data that satisfy a user’s interests
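  As an example of the web-server (pull-type) strategy, the sketch below periodically polls a sink node's HTTP interface and filters the readings of interest. The URL, the plain-text response format, and the polling interval are assumptions made for illustration; a real deployment would use the vendor's own interface (e.g., the Aginova web portal).

  import java.net.URI;
  import java.net.http.HttpClient;
  import java.net.http.HttpRequest;
  import java.net.http.HttpResponse;

  // Hypothetical pull-type retrieval: poll a sink node's built-in web server.
  // The URL and line-oriented response format are invented for illustration.
  public class SensorPoller {
      public static void main(String[] args) throws Exception {
          HttpClient client = HttpClient.newHttpClient();
          HttpRequest request = HttpRequest.newBuilder(
                  URI.create("http://sink-node.example.org/readings?type=temperature"))
                  .GET().build();

          for (int i = 0; i < 3; i++) {                       // poll a few times
              HttpResponse<String> response =
                  client.send(request, HttpResponse.BodyHandlers.ofString());
              // filter out only the data of interest before handing it to computation
              response.body().lines()
                      .filter(line -> line.startsWith("temp,"))
                      .forEach(System.out::println);
              Thread.sleep(5_000);                            // 5-second polling interval
          }
      }
  }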

  12. Connectors: Supplying Data to Computation - Strategy
  • Visualizing external data as files (i.e., Input/OutputStream in Java) local to the computation
  • Filtering out only the data of interest
  • Redirecting data transfer to nomadic computation (see the sketch below)
  [Diagram: external data redirected to computing ranks 0-3]
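  One way this connector strategy could look in practice is sketched below: a small process pulls the raw sensor stream over TCP, keeps only the records of interest, and appends them to an ordinary local file, so the computation can read them through a plain InputStream. The host name, port, record format, and file name are invented for illustration.

  import java.io.BufferedReader;
  import java.io.IOException;
  import java.io.InputStreamReader;
  import java.net.Socket;
  import java.nio.file.Files;
  import java.nio.file.Path;
  import java.nio.file.StandardOpenOption;

  // Hypothetical connector sketch: read a raw sensor stream from a TCP socket,
  // keep only the records of interest, and append them to a local file that the
  // computation opens like any other file.
  public class FileConnector {
      public static void main(String[] args) throws IOException {
          Path localFile = Path.of("sensor-feed.txt");        // file seen by the computation
          try (Socket socket = new Socket("sink-node.example.org", 9000);
               BufferedReader in = new BufferedReader(
                       new InputStreamReader(socket.getInputStream()))) {
              String record;
              while ((record = in.readLine()) != null) {
                  if (!record.startsWith("temp,"))            // filter out other sensor types
                      continue;
                  Files.writeString(localFile, record + System.lineSeparator(),
                          StandardOpenOption.CREATE, StandardOpenOption.APPEND);
              }
          }
      }
  }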

  13. Connectors: Supplying Data to Computation - Proposed Implementation
  • Conversion tools are necessary: publisher/subscriber brokers, HTTP/FTP/CORBA/RMI clients specialized in generating file outputs, and a rank-0 tracker for forwarding stdin/stdout/X
  [Diagram: a WiFi sensor network (access point, HTTP server) and a 418/900 MHz sensor network (sink point, HTTP server) feed file storage through a publisher/subscriber broker, CORBA/RMI, SNMT, FTP, and HTTP clients over SSH tunnels; files flow on to a cluster of ranks 0-3, while a rank-0 tracker forwards stdin, stdout, and X to the user]

  14. Application 1: Temperature Analysis
  • Step 1 (estimating the current spatial state): interpolating temperatures reported by the temperature-sensor network (via a sink point) with
    • inverse distance weighting (http://en.wikipedia.org/wiki/Inverse_distance_weighting) (see the sketch below), or
    • polynomial regression (http://www.kdkeys.net/)
  • Step 2 (predicting the future spatial state):
    • simulating heat propagation with the heat equation (Dr. Terrel: http://www.math.cornell.edu/~bterrell/free.html), or
    • predicting temperature with an artificial neural network (Smith, McClendon, Hoogenboom: Improving Air Temperature Prediction with Artificial Neural Networks, Int’l Journal of Computational Intelligence, Vol. 3, No. 3)
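  Step 1's first interpolation method, inverse distance weighting, estimates the temperature at a query point as the weighted average of the sensor readings, where each weight is 1 / distance^p. The sketch below implements that standard formula; the sensor coordinates, readings, and power parameter are made-up example values.

  // Minimal sketch of inverse distance weighting (IDW): the estimate at a query
  // point is the weighted average of sensor readings, with weight 1 / distance^p.
  // Sensor positions and temperatures below are made up.
  public class InverseDistanceWeighting {

      static double idw(double[][] sensors, double[] temps, double qx, double qy, double p) {
          double num = 0, den = 0;
          for (int i = 0; i < sensors.length; i++) {
              double dx = qx - sensors[i][0], dy = qy - sensors[i][1];
              double dist = Math.sqrt(dx * dx + dy * dy);
              if (dist == 0) return temps[i];       // query point sits on a sensor
              double w = 1.0 / Math.pow(dist, p);   // weight decays with distance
              num += w * temps[i];
              den += w;
          }
          return num / den;
      }

      public static void main(String[] args) {
          double[][] sensors = { { 0, 0 }, { 10, 0 }, { 0, 10 }, { 10, 10 } };  // field corners
          double[] temps = { 2.5, 3.0, 1.5, 2.0 };                              // readings in °C
          System.out.printf("estimated temperature at (4, 6): %.2f%n",
                  idw(sensors, temps, 4, 6, 2.0));
      }
  }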

  15. Application 2: Traffic Simulation
  • Step 1 (retrieving the current traffic state): initializing a traffic simulation with webcam and real-time GIS data (DB servers, webcam network, HTTP server) as well as receiving the user’s travel itinerary
  • Step 2 (estimating travel time): calculating an accurate travel time from the traffic simulation (Multi-Agent Transport Simulation Toolkit, http://matsim.org/)
  [The two graphics on this slide, showing a start-to-goal route estimated at 45 minutes, are snapshots of MATSim execution]

  16. Related Work
  • Model-based sensor-data processing systems
    • MauveDB (MIT)
    • BBQ (U. Maryland, CMU, MIT, Intel, UCB) on top of TinyDB/TinyOS (UCB)
    • Our focus: accelerating model-based sensor-data processing with parallel simulation
  • Lego-like distributed Internet GIS
    • OpenGIS specifications
    • Our focus: making all live data visible through TCP connections
  • Field-based process dispatch and migration
    • Distributed job scheduling on a computational grid using multiple simultaneous requests (Ohio State)
    • Messor multi-agent-based grid scheduler (U. Bologna, Italy)
    • Our focus: combining the top-down self-adaptive and bottom-up self-organized approaches
  • Distributed array / multi-agent library
    • HPF distributed array (Delft U., Netherlands; INRIA, France)
    • Our focus: no awareness of parallel-programming constructs

  17. Plan

  18. Keys to Success
  • Targeting sensor networks in practical use
    • Collaboration with Mr. Eliot (Valhalla-Wireless) and Mr. Doornink (WA State Tree Fruit Research Commission)
  • Receiving proper and timely input from sensor-network specialists
    • Collaboration with Dr. Watanabe’s laboratory at Shizuoka University
  • Getting funded
    • NSF core (e.g., CNS, CCF, and IIS) or cloud-related programs
