

  1. A Real-World Test-bed for Mobile Ad hoc Networks: Methodology, Experimentations, Simulation and Results. Per Gunningberg, Erik Nordström, Christian Rohner, Oskar Wibling Uppsala University

  2. Background and problem • IETF is standardizing MANET (Mobile Ad hoc NETwork) routing protocols: • One proactive protocol - maintains knowledge about all nodes • One reactive protocol - finds paths on demand • Based on experiences from three protocols: • AODV - Ad hoc On-Demand Distance Vector (reactive) • DSR - Dynamic Source Routing (reactive) • OLSR - Optimized Link State Routing (proactive) • Problem: the majority of research is still done through simulations...

  3. Part One • A test-bed for evaluating ad hoc routing protocols. • Close to reality • What to measure and how to analyze • Repeatable experiments • Grey Zone Phenomena • Conclusion

  4. The Uppsala Ad hoc Protocol Evaluation Testbed (APE) • People carrying laptops with 802.11b • Suitable for indoor experiments that are hard to model in simulation

  5. The Ad hoc Protocol Evaluation Testbed (APE) • Execution environment on top of existing OS. • Runs on Windows and Linux. • Scenarios with movement choreography. • Emphasizes easy management for scaling. • 800+ downloads.

  6. Laptop instructions (choreography)
     node.11.action.0.msg=Test is starting...
     node.11.action.0.command=start_spyd
     node.11.action.0.duration=1
     node.11.action.1.command=my_iperf c 2 t 330
     node.11.action.1.msg=Stay at this location.
     node.11.action.1.duration=30
     node.11.action.2.msg=Start moving! Go to Point A, the end of building.
     node.11.action.2.duration=75
     node.11.action.3.msg=You should have arrived at Point A. Please stay.
     node.11.action.3.duration=30
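As a reading aid, here is a minimal Python sketch (not part of APE) of how such a choreography property file could be parsed into per-node, per-action steps; the property names follow the snippet above, everything else is assumed.

```python
# Hypothetical parser for APE-style choreography properties (a sketch, not APE code).
# Each property line reads "node.<id>.action.<n>.<field>=<value>"; the fields seen
# above are msg, command and duration.
from collections import defaultdict

def parse_choreography(lines):
    """Group properties into {node_id: [action dicts ordered by action index]}."""
    fields_by_action = defaultdict(dict)          # (node_id, action_idx) -> {field: value}
    for line in lines:
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, value = line.split("=", 1)
        parts = key.split(".")                    # e.g. ["node", "11", "action", "0", "msg"]
        if len(parts) != 5 or parts[0] != "node" or parts[2] != "action":
            continue
        node_id, idx, field = int(parts[1]), int(parts[3]), parts[4]
        fields_by_action[(node_id, idx)][field] = value
    script = defaultdict(list)
    for (node_id, idx), fields in sorted(fields_by_action.items()):
        script[node_id].append({"index": idx, **fields})
    return dict(script)

if __name__ == "__main__":
    sample = [
        "node.11.action.0.msg=Test is starting...",
        "node.11.action.0.command=start_spyd",
        "node.11.action.0.duration=1",
        "node.11.action.1.command=my_iperf c 2 t 330",
        "node.11.action.1.duration=30",
    ]
    for node, steps in parse_choreography(sample).items():
        for step in steps:
            print(node, step.get("duration"), step.get("msg", step.get("command")))
```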

  7. Measurement procedures • Every node collects SNR from every other node it can hear during the test session • Every event is time stamped • Received Packets/Application results are collected at all nodes • Routing state snapshots are collected • Analysis is done after the test session.
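A hedged sketch of the post-session analysis step: merging per-node measurement logs into per-link SNR time series. The log layout (timestamp, receiver, heard node, SNR in dB) is an assumed CSV format chosen for illustration, not APE's actual log format.

```python
# Sketch of the post-session merge: per-node logs -> per-link SNR time series.
# The CSV layout (timestamp, receiver, heard_node, snr_db) is an assumed format
# for illustration, not APE's actual log format.
import csv
from collections import defaultdict

def load_snr_logs(paths):
    """Return {(receiver, heard_node): [(timestamp, snr_db), ...]}, time-ordered."""
    samples = defaultdict(list)
    for path in paths:
        with open(path, newline="") as f:
            for timestamp, receiver, heard, snr in csv.reader(f):
                samples[(receiver, heard)].append((float(timestamp), float(snr)))
    for series in samples.values():
        series.sort()              # the shared time stamps let logs be aligned later
    return dict(samples)
```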

  8. Replaying a scenario • SNR mapped to virtual distance • Each time interval corresponds to a topological map • [Figure: virtual distance between Point A and Point D plotted over time T]
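A sketch of the "SNR mapped to virtual distance" step. APE defines its own mapping in the paper; here a generic log-distance path-loss inversion stands in, with the reference SNR and path-loss exponent chosen arbitrarily. `samples` follows the structure of the merging sketch above.

```python
# Stand-in for the "SNR mapped to virtual distance" step: a generic log-distance
# path-loss inversion with arbitrarily chosen reference SNR and exponent (APE's
# own mapping is defined in the paper).

def virtual_distance(snr_db, snr_ref=40.0, path_loss_exponent=3.0):
    """Higher SNR -> shorter virtual distance (1.0 at the reference SNR)."""
    return 10 ** ((snr_ref - snr_db) / (10 * path_loss_exponent))

def topology_snapshot(samples, t_start, t_end):
    """Average each link's SNR inside [t_start, t_end) and map it to a distance.

    `samples` has the shape {(receiver, heard_node): [(timestamp, snr_db), ...]},
    as produced by the merging sketch above.
    """
    snapshot = {}
    for link, series in samples.items():
        window = [snr for ts, snr in series if t_start <= ts < t_end]
        if window:
            snapshot[link] = virtual_distance(sum(window) / len(window))
    return snapshot

if __name__ == "__main__":
    demo = {("0", "3"): [(10.0, 35.0), (12.0, 20.0)]}   # synthetic SNR samples
    print(topology_snapshot(demo, 0, 25))                # one map per time interval
```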

  9. APE is a Testbed for… • Relative protocol performance comparisons • Radio channel effects on ad hoc routing protocols • Interactions between hardware, software, protocol, mobility and radio environment (example: Gray Zone Phenomena) • Validation of simulation models • Generation of traces

  10. 802.11 Gray Zone Phenomena • [Figure: four nodes 0-3 in a chain; the broadcast range exceeds the unicast range, so a node can receive broadcast HELLOs from a neighbor to which unicast data does not get through]

  11. Challenge • Results should be reproducible and comparable between tests • It follows that experiments must be repeatable... • ...and therefore stochastic factors need to be dealt with • So – what can we achieve?

  12. Stochastic Factors in Real World Experiments • Node mobility adds frequent changes in the network topology. • We use choreography and “measure topology differences” • Variations in hardware and software configuration. • We use identical hardware and software. • Time varying radio environment affects link quality and error rates.

  13. Topology differences - visual check • [Figure: topology maps for Experiment 1 and Experiment 2 side by side; RED = average mobility, GREEN = the 25% with lowest mobility, BLUE = the 25% with highest mobility]

  14. Part Two • Evaluating MANET protocols with the APE testbed, simulation and emulation. • Scenarios • UDP, Ping and TCP • Side-by-side comparison • Faulty protocol constructs • Conclusion

  15. Coupling Simulation, Emulation and Real World

  16. Routing protocols' ability to adapt • React to connectivity changes • OLSR - proactive link-state routing; monitors neighbors and exchanges link-state info • AODV - broadcasts route requests to set up a path; HELLO messages or link-layer feedback to detect link failures • DSR - broadcasts route requests carrying a source route; listens to other traffic to find shorter routes; RTT measurements and network-layer ACKs
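To make the AODV bullet concrete, a minimal sketch of HELLO-based link failure detection; the constants follow the RFC 3561 defaults, while the class itself is illustrative, not AODV-UU code.

```python
# Minimal sketch of HELLO-based neighbor liveness as used by AODV: a link is
# reported broken after ALLOWED_HELLO_LOSS missed HELLO intervals. Constants
# follow RFC 3561 defaults; the class itself is illustrative, not AODV-UU code.
import time

HELLO_INTERVAL = 1.0      # seconds (RFC 3561 default)
ALLOWED_HELLO_LOSS = 2    # missed intervals before a neighbor is presumed lost

class NeighborTable:
    def __init__(self):
        self.last_hello = {}                      # neighbor -> time of last HELLO

    def on_hello(self, neighbor, now=None):
        """Record a received HELLO (or any packet proving the link is up)."""
        self.last_hello[neighbor] = time.monotonic() if now is None else now

    def broken_links(self, now=None):
        """Neighbors whose HELLOs have stopped; routes over them get invalidated."""
        now = time.monotonic() if now is None else now
        deadline = ALLOWED_HELLO_LOSS * HELLO_INTERVAL
        return [n for n, t in self.last_hello.items() if now - t > deadline]
```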

  17. Emulation • Same configuration as Real world • Table-top emulation • MAC filters force connectivity changes • Reduces radio and mobility factors • Interference reduces bandwidth
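An illustration (not APE's actual emulation tooling) of how connectivity changes could be forced by dropping traffic from selected MAC addresses on a schedule; it assumes root privileges and the standard iptables `mac` match, and the MAC address in the example is hypothetical.

```python
# Illustration of forcing connectivity changes by dropping traffic from selected
# MAC addresses on a schedule (not APE's actual emulation tooling). Requires
# root and the standard iptables "mac" match.
import subprocess
import time

def block(mac):
    """Drop all packets whose source MAC is `mac` (the link 'disappears')."""
    subprocess.run(["iptables", "-A", "INPUT", "-m", "mac",
                    "--mac-source", mac, "-j", "DROP"], check=True)

def unblock(mac):
    """Delete the rule again (the link 'reappears')."""
    subprocess.run(["iptables", "-D", "INPUT", "-m", "mac",
                    "--mac-source", mac, "-j", "DROP"], check=True)

def run(schedule):
    """schedule: list of (seconds_from_start, block_or_unblock, mac_address)."""
    start = time.monotonic()
    for at, action, mac in sorted(schedule, key=lambda event: event[0]):
        time.sleep(max(0.0, at - (time.monotonic() - start)))
        action(mac)

if __name__ == "__main__":
    # Hypothetical MAC address: node 3's link goes down after 30 s, back up at 60 s.
    run([(30, block, "00:02:2d:aa:bb:cc"), (60, unblock, "00:02:2d:aa:bb:cc")])
```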

  18. Simulation • Scenarios recreated in an ns-2 simulation using "default" models: • Transmission range tuned to better match indoors • Mobility with jitter modeled after real-world measurements • Results averaged over 10 runs • Results provide a baseline • Can simulations using default (simple) models be used to predict routing protocol performance in complex real-world environments?

  19. Multidimensional Comparison: 3 x 3 x 3 x (10 runs) = 270 runs • Three MANET routing protocol implementations: OOLSR, AODV-UU, DSR-UU • Three traffic types: UDP (20 pkts/s CBR), Ping (20 pkts/s CBR), TCP (file transfer) • Three mobility scenarios: End node swap, Relay node swap, Roaming node • Three environments (dimensions): Simulation, Emulation, Real world
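The 270 runs follow directly from the matrix on this slide; a small sketch that enumerates it (labels taken from the slide, formatting is ours):

```python
# Enumerating the experiment matrix from this slide: 3 protocols x 3 traffic
# types x 3 mobility scenarios x 10 runs = 270 runs, repeated in each of the
# three environments (simulation, emulation, real world).
from itertools import product

PROTOCOLS = ["OOLSR", "AODV-UU", "DSR-UU"]
TRAFFIC = ["UDP 20 pkts/s CBR", "Ping 20 pkts/s CBR", "TCP file transfer"]
SCENARIOS = ["End node swap", "Relay node swap", "Roaming node"]
RUNS = range(1, 11)

matrix = list(product(PROTOCOLS, TRAFFIC, SCENARIOS, RUNS))
assert len(matrix) == 270
for protocol, traffic, scenario, run in matrix[:3]:
    print(f"run {run:2d}: {protocol:8} {traffic:20} {scenario}")
```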

  20. Experimental Test Environment • Indoors with offices and corridors • Four nodes (0, 1, 2, 3) • Four waypoints (A, B, C, D) • One data stream from node 3 to node 0

  21. Relay node swap • [Figure: waypoints A, B, C, D and nodes 0, 1, 2, 3; the intermediate (relay) nodes swap positions]

  22. Scenarios – Relay Node Swap • End nodes stationary • Intermediate nodes change positions • Hop count never smaller than 2

  23. End node swap • [Figure: waypoints A, B, C, D and nodes 0, 1, 2, 3; the end nodes swap positions]

  24. Scenarios – End Node Swap • End nodes change positions • Intermediary nodes stationary • Hop count changes from 3 to (2) and 1 and back

  25. Roaming node • [Figure: waypoints A, B, C, D and nodes 0, 1, 2, 3; the roaming node scenario]

  26. Scenarios – Roaming Node • Roaming node is source node • All other nodes stationary

  27. Results – Relay Node Swap

  28. Results – End Node Swap

  29. Results – Roaming Node

  30. AODV - UDP - End Node Swap

  31. OLSR - UDP - End Node Swap

  32. TCP - Simulation/Real World

  33. Observations • Simulation and emulation are similar in absolute CBR performance but not in relative protocol ranking • Real-world CBR performance is significantly lower • The discrepancy grows with traffic complexity and scenario • TCP performance is orders of magnitude lower in the real world compared to simulation • Periods of no progress in the real world

  34. Observations (continued) • OLSR tries less hard to re-route and therefore achieves more even performance • Radio factors account for most of the discrepancy between simulation and real world... • ...but secondary effects, such as cross-layer interactions that are protocol specific, dominate, e.g.: • Lost HELLOs (AODV) • Excessive buffering (DSR)

  35. Protocol comparison conclusion • If one protocol performs better than another in simulation, is it possible to assume the same for the real world? • NO

  36. Latency - Ping - Relay Node

  37. Flip-Flop Routing (DSR) - Real World vs. Simulation

  38. Adapting to topology change

  39. Routing Control Overhead

  40. Conclusions • APE aims to address the lack of real-world ad hoc experimental research test-beds • Repeatability addressed at a level that allows relative protocol comparisons • The value of cross-environment evaluation • Reveals sensing problems leading to instabilities and poor performance • Not visible in simulations

  41. The End • Paper: • http://www.it.uu.se/research/group/core/publications/GC_technical_report.pdf • APE testbed: • http://apetestbed.sourceforge.net/ • The Research group: • http://www.it.uu.se/research/group/core/

  42. Extra Slides • More details…

  43. Self Interference Simulation

  44. UDP

  45. Ping

  46. TCP
