A Real-World Test-bed for

Mobile Ad hoc Networks:

Methodology, Experimentations, Simulation and Results.

Per Gunningberg, Erik Nordström, Christian Rohner, Oskar Wibling

Uppsala University

Background and problem
  • IETF is standardizing MANET (Mobile Ad hoc NETwork) routing protocols:
    • One proactive protocol - maintains knowledge about all nodes
    • One reactive protocol - sets up a path on demand
  • Based on experiences from three protocols:
    • AODV - Ad hoc On-Demand Distance Vector (reactive)
    • DSR - Dynamic Source Routing (reactive)
    • OLSR - Optimized Link State Routing (proactive)
  • Problem: the majority of research is done through simulations...
Part One
  • A test-bed for evaluating ad hoc routing protocols.
  • Close to reality
  • What to measure and how to analyze
  • Repeatable experiments
  • Grey Zone Phenomena
  • Conclusion
The Uppsala Ad hoc Protocol Evaluation Testbed (APE)
  • People carrying laptops with 802.11b
  • Suitable for indoor experiments that are hard to model in simulation
The Ad hoc Protocol Evaluation Testbed (APE)
  • Execution environment on top of existing OS.
    • Runs on Windows and Linux
  • Scenarios with movement choreography.
  • Emphasizes easy management for scaling.
  • 800+ downloads.
Laptop instructions (choreography)

node.11.action.0.msg=Test is starting...



node.11.action.1.command=my_iperf c 2 t 330

node.11.action.1.msg=Stay at this location.


node.11.action.2.msg=Start moving! Go to Point A, the end of building.


node.11.action.3.msg=You should have arrived at Point A. Please stay.
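An illustrative sketch of how such action files could be parsed (a hypothetical parser, not APE's actual code; only the node.&lt;id&gt;.action.&lt;n&gt;.&lt;field&gt;=&lt;value&gt; key format is taken from the example above):

```python
# Hypothetical parser for APE-style choreography files. Only the key format
# (node.<id>.action.<n>.<field>=<value>) is taken from the example above;
# the parser itself is an illustration, not APE's code.
from collections import defaultdict

def parse_choreography(lines):
    """Group 'node.<id>.action.<n>.<field>=<value>' lines by (node, action)."""
    actions = defaultdict(dict)  # (node_id, action_no) -> {field: value}
    for line in lines:
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip blanks and non key=value lines
        key, value = line.split("=", 1)
        parts = key.split(".")
        if len(parts) == 5 and parts[0] == "node" and parts[2] == "action":
            node_id, action_no, field = int(parts[1]), int(parts[3]), parts[4]
            actions[(node_id, action_no)][field] = value
    return dict(actions)

script = """\
node.11.action.0.msg=Test is starting...
node.11.action.1.command=my_iperf -c 2 -t 330
node.11.action.1.msg=Stay at this location.
""".splitlines()

parsed = parse_choreography(script)
```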


Measurement procedures
  • Every node collects SNR from every other node it can hear during the test session
  • Every event is time stamped
  • Received Packets/Application results are collected at all nodes
  • Routing state snapshots are collected
  • Analysis is done after the test session.
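The per-node measurement collection described above could be sketched as follows (illustrative only, not APE's logging code; the field names are assumptions):

```python
# Illustrative only (not APE's logging code): every node keeps a timestamped
# SNR sample for each neighbor it hears; analysis happens after the session.
import time

def record_snr_sample(log, receiver, sender, snr_db, now=None):
    """Append one timestamped SNR measurement to an in-memory log."""
    log.append({
        "t": time.time() if now is None else now,  # every event is time stamped
        "receiver": receiver,
        "sender": sender,
        "snr_db": snr_db,
    })

log = []
record_snr_sample(log, receiver=0, sender=3, snr_db=24.5, now=100.0)
```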
Replaying a scenario
  • SNR mapped to virtual distance
  • Each time interval corresponds to a topological map
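The SNR-to-virtual-distance mapping can be sketched as follows (the linear form and the constants are illustrative assumptions, not APE's calibrated function):

```python
# Hedged sketch of the "SNR -> virtual distance" idea: higher SNR means the
# nodes are (virtually) closer. The linear mapping and snr_max constant are
# illustrative assumptions, not APE's calibrated function.
def virtual_distance(snr_db, snr_max=40.0):
    """Map an SNR reading (dB) to a unitless virtual distance in [0, 1]."""
    snr = max(0.0, min(snr_db, snr_max))
    return 1.0 - snr / snr_max  # maximum SNR -> distance 0 (adjacent)

def topology_snapshot(snr_samples):
    """One topological map per time interval.

    snr_samples: {(a, b): [snr_db, ...]} -> {(a, b): virtual distance},
    averaging each link's samples over the interval.
    """
    return {link: virtual_distance(sum(v) / len(v))
            for link, v in snr_samples.items()}

snap = topology_snapshot({(0, 1): [20.0, 20.0], (1, 2): [40.0]})
```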

[Figure: replayed topology maps; node positions between Point D and Point A]

APE is a Testbed for…
  • Relative protocol performance comparisons
  • Radio channel effects on ad hoc routing protocols
  • Interactions between hardware, software, protocol, mobility and radio environment

Example: Grey Zone Phenomena

  • Validation of simulation models
  • Generation of traces
802.11 Gray Zone Phenomena
  • Results should be reproducible and comparable between tests
  • It follows that experiments must be repeatable...
  • ...and therefore stochastic factors need to be dealt with
  • So – what can we achieve?
Stochastic Factors in Real World Experiments
  • Node mobility adds frequent changes in the network topology.
    • We use choreography and “measure topology differences”
  • Variations in hardware and software configuration.
    • We use identical hardware and software.
  • Time-varying radio environment affects link quality and error rates.
Topology differences - visual check

[Figures: node-mobility plots for Experiment 1 and Experiment 2. RED = average mobility, GREEN = 25% with lowest mobility, BLUE = 25% with highest mobility]
Part Two
  • Evaluating MANET protocols with the APE testbed, simulation and emulation.
  • Scenarios
  • UDP, Ping and TCP
  • Side-by-side comparison
  • Faulty protocol constructs
  • Conclusion
Routing protocols' ability to adapt
  • OLSR - Proactive link-state routing. Monitors neighbors and exchanges link-state info.
  • AODV - Broadcasts to set up a path. HELLO messages or link-layer feedback to detect link failures.
  • DSR - Broadcasts with a source route. Listens to other traffic to find shorter routes. RTT measurements and network-layer ACKs.

React to connectivity changes

  • Same configuration as Real world
  • Table-top emulation
  • MAC filters force connectivity changes
  • Reduces radio and mobility factors
  • Interference reduces bandwidth
  • Scenarios recreated in a ns2-simulation using “default” models:
    • Transmission range tuned to better match indoors
    • Mobility with jitter modeled after real world measurements
    • Results averaged over 10 runs
  • Results provide a baseline
  • Can simulations using default (simple) models be used to predict routing protocol performance in complex real world environments?
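One way the "mobility with jitter" recreation could look (illustrative only; the Gaussian model and sigma value are assumptions, not the measured distribution):

```python
# Illustrative only: adding jitter to scripted waypoint arrival times when a
# choreographed scenario is recreated in simulation. The Gaussian model and
# sigma are assumptions, not the distribution measured in the experiments.
import random

def jittered_waypoints(waypoints, sigma_s=2.0, seed=1):
    """waypoints: list of (t_seconds, point_name) -> same with jittered times."""
    rng = random.Random(seed)  # fixed seed per run for reproducibility
    out = [(max(0.0, t + rng.gauss(0.0, sigma_s)), point)
           for t, point in waypoints]
    out.sort(key=lambda wp: wp[0])  # keep visiting order monotonic in time
    return out

wps = jittered_waypoints([(0.0, "A"), (60.0, "B"), (120.0, "C")])
```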

3x3x3x(10 runs) = 270 runs

Multidimensional Comparison
  • Three MANET routing protocol implementations:
    • AODV, DSR and OLSR
  • Three traffic types:
    • UDP (20 pkts/s CBR)
    • Ping (20 pkts/s CBR)
    • TCP (File transfer)
  • Three mobility scenarios:
    • End node swap, Relay node swap, Roaming node
  • Three environments (dimensions):
    • Simulation, Emulation, Real world
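The run matrix above can be enumerated directly (3 protocols x 3 traffic types x 3 scenarios x 10 repetitions = 270 runs, repeated per environment):

```python
# Sketch of the experiment matrix described above: 3 protocols x 3 traffic
# types x 3 mobility scenarios, each repeated 10 times = 270 runs, and the
# whole matrix is run in each environment (simulation, emulation, real world).
from itertools import product

protocols = ["AODV", "DSR", "OLSR"]
traffic = ["UDP", "Ping", "TCP"]
scenarios = ["End node swap", "Relay node swap", "Roaming node"]
RUNS = 10

runs = [(p, t, s, r)
        for p, t, s in product(protocols, traffic, scenarios)
        for r in range(RUNS)]
```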
Experimental Test Environment
  • Indoors with offices and corridors
  • Four nodes (0, 1, 2, 3)
  • Four waypoints (A, B, C, D)
  • One data stream from node 3 to node 0
Scenarios – Relay Node Swap
  • End nodes stationary
  • Intermediate nodes change positions
  • Hop count never smaller than 2
Scenarios – End Node Swap
  • End nodes change positions
  • Intermediate nodes stationary
  • Hop count changes from 3 to (2) and 1 and back
Scenarios – Roaming Node
  • Roaming node is source node
  • All other nodes stationary

Observations
  • Simulation and emulation are similar in absolute CBR performance, but not in relative protocol ranking
  • Real-world CBR performance is significantly lower
    • The discrepancy grows with traffic complexity and scenario
  • TCP performance is orders of magnitude lower in the real world than in simulation
    • Periods of no-progress time in the real world
Observations (continued)
  • OLSR tries less hard to re-route and therefore achieves more even performance
  • Radio factors account for most of the discrepancy between simulation and real world...
  • ...but secondary effects, such as cross-layer interactions that are protocol specific, dominate, e.g.:
    • Lost HELLOs (AODV)
    • Excessive buffering (DSR)
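The "lost HELLOs" effect can be sketched as follows (a hedged illustration of AODV-style HELLO-based neighbor sensing, not the tested implementation; the constants are RFC 3561 defaults):

```python
# Hedged sketch of HELLO-based neighbor sensing (AODV-style). A neighbor is
# declared lost after ALLOWED_HELLO_LOSS * HELLO_INTERVAL seconds without a
# HELLO, so a couple of randomly dropped HELLOs on a live link looks like a
# link break - the "lost HELLOs" effect above. Constants are RFC 3561 defaults.
HELLO_INTERVAL = 1.0     # seconds between HELLO broadcasts
ALLOWED_HELLO_LOSS = 2   # tolerated consecutive losses

def link_alive(last_hello_time, now):
    """True while the neighbor's most recent HELLO is recent enough."""
    return (now - last_hello_time) <= ALLOWED_HELLO_LOSS * HELLO_INTERVAL
```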
Protocol comparison conclusion
  • If one protocol performs better than another in simulation, is it possible to assume the same for the real world?
  • NO
Flip-Flop Routing DSR

Real world

Conclusions
  • APE aims to address the lack of real world ad hoc experimental research test-beds
  • Repeatability addressed at a level that allows relative protocol comparisons
  • The value of cross-environment evaluation
  • Reveals sensing problems leading to instabilities and poor performance
    • Not visible in simulations
The End
  • Paper:
  • APE testbed:
  • The Research group:
Extra Slides
  • More details…