
Proactive Caching Strategies for IAPP Latency Improvement during 802.11 Handoff


Presentation Transcript


  1. Proactive Caching Strategies for IAPP Latency Improvement during 802.11 Handoff
  Arunesh Mishra, Minho Shin, William Arbaugh (University of Maryland, College Park)
  Insun Lee, Kyunghun Jang (Samsung Electronics)

  2. The Handoff Procedure
  • Probe Phase: the STA scans for APs (probe requests and probe responses exchanged with the new AP and other APs)
  • Reassociation Phase: the STA attempts to associate to the preferred AP (Reassociation Request, IAPP exchange between the APs, Reassociation Response)

  3. The Handoff Procedure – Probe Phase
  • Empirical results: high latencies and large variation
  [Charts: Average Values of Handoff Latencies; Variation in Handoff Latencies]

  4. The Handoff Procedure – Reassociation Phase (IAPP Messages)
  • Four IAPP messages: Send Security Block, Ack Security Block, Move Notify, Move Response
  • IAPP latency > 4 × RTT
  • Move Notify and Move Response are carried over TCP
  [Message sequence: STA → New AP: Reassociation Request; New AP ↔ Old AP: Send Security Block / Ack Security Block, then Move Notify / Move Response; New AP → STA: Reassociation Response]
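To make the "> 4 × RTT" structure on this slide concrete, here is a minimal back-of-the-envelope sketch; the helper names and numbers are illustrative assumptions, not measurements from the paper. It only encodes what the slide states: two inter-AP request/response pairs, with the Move pair additionally paying for TCP setup, all sitting inline between the Reassociation Request and Response.

# Back-of-the-envelope model of the inline IAPP cost during reassociation.
# All names and numbers below are illustrative placeholders, not values from the paper.

def iapp_latency_ms(inter_ap_rtt_ms: float, tcp_setup_ms: float) -> float:
    """Latency added by the four IAPP messages between the new AP and the old AP."""
    security_block = inter_ap_rtt_ms        # Send Security Block + Ack Security Block
    move = tcp_setup_ms + inter_ap_rtt_ms   # Move Notify + Move Response, carried over TCP
    return security_block + move

def reassociation_latency_ms(air_ms: float, rtt_ms: float, tcp_ms: float) -> float:
    """Reassociation Request/Response over the air, with the IAPP exchange in between."""
    return air_ms + iapp_latency_ms(rtt_ms, tcp_ms)

# Placeholder example: 2 ms wired RTT, 3 ms TCP connection setup, 2 ms over-the-air exchange.
print(reassociation_latency_ms(air_ms=2.0, rtt_ms=2.0, tcp_ms=3.0))  # 9.0 ms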

  5. Experimental Setup
  • OpenBSD on Soekris (www.soekris.com) based APs
  • Average reassociation latency without IAPP on Cisco APs ≈ 2 ms
  • Average reassociation latency without IAPP on Soekris APs ≈ 55 ms
  • Average reassociation latency without IAPP on a Pentium 4 laptop ≈ 1.583 ms

  6. Proactive Caching Algorithm
  • Key idea: propagate security contexts to potential 'next' APs to eliminate IAPP latency during reassociation
  1. STA associates to AP A
  2. AP A proactively sends the security context to AP B (a new IAPP message)
  3. STA moves to AP B and performs a fast reassociation, since B already has the security context in its cache
  [Diagram: STA's path of motion from AP A to AP B, with the security context pushed ahead of the move]

  7. Proactive Caching Algorithm – AP Neighborhood Graph
  • Two APs i and j are neighbors if there exists a path of motion between i and j along which a mobile STA can perform a reassociation
  • Captures the 'potential next AP' relationship
  • Distributed data structure: each AP maintains only its own list of neighbors
  [Diagram: example neighborhood graph over APs A–E and the corresponding per-AP neighbor lists]
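Because each AP stores only its own adjacency, the distributed data structure can be sketched as little more than a per-AP set. The sketch below uses hypothetical class and method names (nothing here is defined by IAPP or the paper):

# Sketch of the distributed neighborhood-graph entry held by one AP.
# Class, method, and identifier names are illustrative.

class NeighborList:
    """The slice of the AP neighborhood graph that a single AP maintains."""

    def __init__(self, my_bssid):
        self.my_bssid = my_bssid
        self.neighbors = set()   # BSSIDs of 'potential next' APs

    def add_neighbor(self, bssid):
        """Record that a reassociation path of motion exists between this AP and `bssid`."""
        if bssid != self.my_bssid:
            self.neighbors.add(bssid)

# Example: AP A is configured with (or learns) neighbors B and C.
ap_a = NeighborList("00:11:22:aa:aa:aa")
ap_a.add_neighbor("00:11:22:bb:bb:bb")
ap_a.add_neighbor("00:11:22:cc:cc:cc")
print(sorted(ap_a.neighbors))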

  8. Proactive Caching – The Algorithm
  • When STA c associates/reassociates to AP i:
  • If context(c) is in the cache: send the Reassociation Response to the client and send Move-Notify to the old AP
  • If context(c) is not in the cache: perform the normal IAPP operation
  • In either case, send the security context to all Neighbors(i)
  • Cache replacement: Least Recently Used (LRU)
  • Cache size is vendor dependent
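A compact sketch of this per-AP logic, assuming an OrderedDict as the LRU cache. Everything here (class name, helper functions, how the context reaches a neighbor) is a hypothetical stand-in for the behavior the slides describe, not the authors' implementation; the message-sending helpers are print stubs in place of IAPP/802.11 I/O.

from collections import OrderedDict

def send_reassociation_response(sta):
    print(f"-> Reassociation Response to {sta}")

def send_move_notify(old_ap, sta):
    print(f"-> Move Notify to {old_ap} for {sta}")

def full_iapp_exchange(old_ap, sta):
    print(f"-> full IAPP exchange with {old_ap} for {sta} (slow path)")
    return b"security-context"

def propagate_context(neighbor, sta, context):
    print(f"-> proactive context push for {sta} to {neighbor}")

class ProactiveCachingAP:
    def __init__(self, bssid, cache_size=32):   # cache size is vendor dependent
        self.bssid = bssid
        self.cache_size = cache_size
        self.neighbors = set()        # this AP's entry in the neighborhood graph
        self.cache = OrderedDict()    # STA -> security context, kept in LRU order

    def cache_put(self, sta, context):
        self.cache[sta] = context
        self.cache.move_to_end(sta)
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)   # evict the least recently used entry

    def on_reassociation_request(self, sta, old_ap):
        if sta in self.cache:
            self.cache.move_to_end(sta)      # cache hit: fast path, no inline IAPP
            send_reassociation_response(sta)
            send_move_notify(old_ap, sta)    # old AP is informed off the critical path
        else:
            context = full_iapp_exchange(old_ap, sta)   # normal (slow) IAPP operation
            self.cache_put(sta, context)
            send_reassociation_response(sta)
        for neighbor in self.neighbors:      # propagate to all Neighbors(i)
            propagate_context(neighbor, sta, self.cache[sta])

# Toy walk-through: the STA moves A -> B.  A misses and runs the full IAPP exchange,
# then pushes the context towards B; B hits in its cache and reassociates fast.
ap_a = ProactiveCachingAP("AP-A"); ap_a.neighbors = {"AP-B"}
ap_b = ProactiveCachingAP("AP-B"); ap_b.neighbors = {"AP-A"}
ap_a.on_reassociation_request("sta-1", old_ap="AP-old")
ap_b.cache_put("sta-1", b"security-context")           # simulates delivery of the push to B
ap_b.on_reassociation_request("sta-1", old_ap="AP-A")  # cache hit

The OrderedDict gives the LRU policy almost for free: move_to_end on every hit, popitem(last=False) on overflow.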

  9. IAPP Messages with Proactive Caching
  • STA reassociates to AP A, which already has the security context in its cache
  • AP A propagates the context to AP B (and to all other neighbors of A)
  • STA then reassociates to AP B, which again has the security context in its cache
  [Message sequence: STA → AP A: Reassociation Request; AP A (context in cache) → STA: Reassociation Response; AP A → AP B: propagate context (stored in B's cache); STA → AP B: Reassociation Request; AP B (context in cache) → STA: Reassociation Response]

  10. AP Neighborhood Graph – Automated Learning
  • Construction: manual configuration of each AP, or the APs can learn the graph themselves
  • If STA c sends a Reassociate Request to AP i with old-AP = AP j, create the neighbor edge (i, j) (i.e. an entry in AP i for j, and vice versa)
  • Learning costs only one 'high latency' handoff per edge in the graph
  • Enables mobility of the APs themselves and can be extended to wireless networks with an ad-hoc backbone infrastructure
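The learning rule itself is one line per direction. The sketch below keeps both neighbor sets in a single dict purely for illustration (hypothetical names); on real hardware each AP holds only its own set, and the reverse edge at AP j would be created when AP j is contacted during the same one-time, high-latency handoff.

from collections import defaultdict

def learn_edge(neighbors_of, ap_i, ap_j):
    """Called when AP `ap_i` receives a Reassociate Request whose old-AP field is `ap_j`."""
    if ap_i == ap_j:
        return
    neighbors_of[ap_i].add(ap_j)   # entry in AP i for j ...
    neighbors_of[ap_j].add(ap_i)   # ... and vice versa

# Example: the first handoff from AP-A to AP-B pays full IAPP latency but creates the edge.
neighbors_of = defaultdict(set)
learn_edge(neighbors_of, "AP-B", "AP-A")
print(dict(neighbors_of))   # {'AP-B': {'AP-A'}, 'AP-A': {'AP-B'}}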

  11. Proactive Caching – Expected Performance
  • Handoff latencies play a significant role in performance when mobility is high
  • With an LRU cache, higher mobility gives a higher cache-hit ratio on average, and therefore a larger number of fast handoffs
  [Chart: cache hit ratio vs. mobility]

  12. Proactive Caching – Latency Improvements
  • Measurements on the Soekris/OpenBSD platform using the Prism2 HostAP driver:
  • Basic reassociation: 55.9 ms
  • Reassociation with IAPP: 273.8 ms
  • IAPP with proactive caching: 56.2 ms

  13. Proactive Caching – Latency Improvements
  • Measurements on a Pentium 4 laptop (IBM ThinkPad T23) using the Prism2 HostAP driver:
  • Basic reassociation: 1.583 ms
  • Reassociation with IAPP: 12.67 ms
  • IAPP with proactive caching: 1.982 ms

  14. Latency Summary Chart
  [Chart: summary of the reassociation latencies reported on slides 12 and 13]

  15. Conclusions
  • IAPP increases handoff latencies drastically (by a factor of at least 5–8); slow communication between APs can be a bottleneck
  • IAPP with proactive caching provides security together with performance, performs better with higher mobility, and reduces inter-AP communication
  • All of this is accomplished without significant changes to IAPP: only one message is added
