Proxy Caching Mechanism for Multimedia Playback Streams in the Internet

Presentation Transcript


  1. Proxy Caching Mechanism for Multimedia Playback Streams in the Internet R. Rejaie, M. Handley, H. Yu, D. Estrin USC/ISI http://netweb.usc.edu/reza/ WCW’99 April 1, 1999

  2. Motivation • Rapid growth in deployment of real-time streams (audio/video) over the Internet • Goals • Maximize the quality of the delivered stream • Minimize startup latency • Low-latency VCR-functionality • Minimize the load on the server & the network

  3. Outline • An End-to-end Architecture • Multimedia Proxy Caching • Conclusion • Future Directions

  4. Streaming Applications in Best-effort Networks (The Internet) • End-to-end congestion control is crucial for stability, fairness & high utilization • Results in variable transmission rate • Streaming applications require constant average consumption rate • Streaming applications should be quality adaptive

  5. Quality Adaptation (QA) • Buffering only absorbs short-term variations • A long-lived session could result in buffer overflow or underflow • QA is complementary to buffering • Adjust the quality (rate) with long-term variations • Layered framework (Figure: layered bandwidth BW(t) over time)
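
To make the adaptation idea on this slide concrete, here is a minimal Python sketch of a layered add/drop decision: buffering absorbs short-term rate variation, while the number of active layers tracks long-term bandwidth. The constant-rate layers, thresholds, and function name are illustrative assumptions, not the mechanism specified in the paper.

    LAYER_RATE = 32_000        # bytes/sec per layer (assumed constant-rate layers)
    ADD_THRESHOLD = 5.0        # seconds of buffered data needed before adding a layer
    DROP_THRESHOLD = 1.0       # seconds of buffered data below which a layer is dropped
    MAX_LAYERS = 5

    def adapt_quality(active_layers, buffered_seconds, avg_bandwidth):
        """Return the new number of active layers for one adaptation step."""
        consumption = active_layers * LAYER_RATE
        # Long-term surplus and a healthy buffer: try one more layer.
        if (avg_bandwidth > (active_layers + 1) * LAYER_RATE
                and buffered_seconds > ADD_THRESHOLD
                and active_layers < MAX_LAYERS):
            return active_layers + 1
        # Long-term deficit draining the buffer: drop the top layer.
        if avg_bandwidth < consumption and buffered_seconds < DROP_THRESHOLD:
            return max(1, active_layers - 1)
        return active_layers

    # Example: 3 active layers, 0.5 s buffered, long-term bandwidth of 2.5 layers.
    print(adapt_quality(active_layers=3, buffered_seconds=0.5,
                        avg_bandwidth=2.5 * LAYER_RATE))   # -> 2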

  6. The End-to-end Architecture (Figure: data path and control path across the Internet between server and client; server side — Archive, Error Control, Quality Adaptation, Congestion Control, Transmission Buffer, Buffer Manager; client side — Acker, Playback Buffer, Adaptation Buffer, Buffer Manager, Decoder)

  7. Limitation • Delivered quality is limited to the average bandwidth between the server and client • Solutions: • Mirror servers • Proxy caching (Figure: quality (layers L0–L4) vs. time; clients behind an ISP reaching the server across the Internet)

  8. Multimedia Proxy Caching • Assumptions • Proxy can perform: • End-to-end congestion control • Quality Adaptation • Goals of proxy caching • Improve delivered quality • Low-latency VCR-functions • Natural benefits of caching

  9. Challenge • Cached streams have variable quality • Layered organization provides opportunity for adjusting quality (Figure: played-back stream vs. stored stream; quality as the number of active layers L0–L4 over time)

  10. Issues • Delivery procedure • Relaying on a cache miss • Pre-fetching on a cache hit • Replacement algorithm • Determining popularity • Replacement pattern

  11. Cache Miss Scenario • Stream is located at the original server • Playback from the server through the proxy • Proxy relays and caches the stream • No benefit in a miss scenario

  12. Cache Hit Scenario • Playback from the proxy cache • Lower latency • May have better quality! • Available bandwidth allows: • Lower quality playback • Higher quality playback
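
A rough Python sketch of the delivery procedure on slides 11 and 12, under assumed cache/server/client objects (serve_request, server.fetch, and client.send are hypothetical names): the proxy relays and caches on a miss, and plays back from its cache on a hit; pre-fetching on a hit is sketched separately after slides 13 and 14.

    def serve_request(stream_id, cache, server, client):
        """Relay and cache on a miss; play back from the proxy cache on a hit."""
        if stream_id in cache:                    # cache hit: lower latency
            for segment in cache[stream_id]:
                client.send(segment)
        else:                                     # cache miss: no benefit this time
            cache[stream_id] = []
            for segment in server.fetch(stream_id):
                cache[stream_id].append(segment)  # cache while relaying
                client.send(segment)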

  13. Lower quality playback • Missing pieces of the active layers are pre-fetched on-demand • Required pieces are identified by QA • Results in smoothing (Figure: pre-fetched data filling the gaps of the stored stream in the played-back stream; quality as the number of active layers L0–L4 over time)
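
One way to read "required pieces are identified by QA": given the per-layer segments already cached and the set of active layers, any hole in an active layer within the upcoming playback window is queued for pre-fetching. A small Python sketch under that assumption (the segment-indexed data structures are illustrative):

    def missing_pieces(cached, active_layers, next_segment, window):
        """cached[layer] is the set of segment indices already on the proxy."""
        to_prefetch = []
        for seg in range(next_segment, next_segment + window):
            for layer in range(active_layers):
                if seg not in cached.get(layer, set()):
                    to_prefetch.append((layer, seg))   # fill the hole before it is played
        return to_prefetch

    # Example: layer 2 has a gap at segments 5-6, so only those are pre-fetched.
    cached = {0: set(range(10)), 1: set(range(10)), 2: {0, 1, 2, 3, 4, 7, 8, 9}}
    print(missing_pieces(cached, active_layers=3, next_segment=4, window=4))
    # -> [(2, 5), (2, 6)]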

  14. Higher quality playback • Pre-fetch higher layers on-demand • Pre-fetched data is always cached • Must pre-fetch a missing piece before its playback time • Tradeoff (Figure: pre-fetched higher-layer data added above the stored stream in the played-back stream; quality as the number of active layers L0–L4 over time)
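
The "must pre-fetch a missing piece before its playback time" constraint is essentially a deadline check against the estimated proxy-server bandwidth. A hedged sketch of that tradeoff in Python (the simple bandwidth model and names are assumptions):

    def can_prefetch(piece_bytes, playback_time, now, est_bandwidth, pending_bytes=0):
        """True if the piece (plus already queued pre-fetch data) can arrive by its deadline."""
        transfer_time = (pending_bytes + piece_bytes) / est_bandwidth
        return now + transfer_time <= playback_time

    # Example: a 64 KB higher-layer piece due in 2 s over a 40 KB/s path with 16 KB queued.
    print(can_prefetch(64_000, playback_time=12.0, now=10.0,
                       est_bandwidth=40_000, pending_bytes=16_000))   # -> True (exactly on time)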

  15. Replacement Algorithm • Goal: converge the cache state to optimal • Average quality of a cached stream depends on • popularity • average bandwidth between proxy and recent interested clients • Variation in quality inversely depends on • popularity

  16. Popularity • Number of hits during an interval • User’s level of interest (including VCR-functions) • Potential value of a layer for quality adaptation • Calculate the weighted hit (whit) on a per-layer basis: whit = PlaybackTime(sec) / StreamLength(sec) • Layered encoding guarantees a monotonic decrease in the popularity of layers
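
A small Python sketch of the per-layer bookkeeping implied by this slide: each playback contributes whit = PlaybackTime / StreamLength, accumulated for every layer the client actually received, so lower layers can only be at least as popular as higher ones. The interval handling and data structures are assumptions for illustration.

    from collections import defaultdict

    popularity = defaultdict(float)   # (stream_id, layer) -> accumulated whit over the interval

    def record_playback(stream_id, layers_delivered, playback_time, stream_length):
        whit = playback_time / stream_length          # weighted hit for this playback
        for layer in range(layers_delivered):
            popularity[(stream_id, layer)] += whit    # lower layers accumulate at least as much

    # Example: a client watches 90 s of a 300 s stream and received layers 0-2.
    record_playback("news", layers_delivered=3, playback_time=90.0, stream_length=300.0)
    print(popularity[("news", 0)])    # -> 0.3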

  17. Replacement Pattern • Multi-valued replacement decision for a multimedia object • Coarse-grain flushing • on a per-layer basis • Fine-grain flushing • on a per-segment basis (Figure: cached segments; coarse-grain flushing drops a whole layer, fine-grain flushing drops segments; quality (layer) vs. time)
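
To illustrate the two granularities, here is a Python sketch in which the victim is taken to be the highest cached layer of the least popular stream: segments are trimmed from its end (fine-grain) until the remainder is small enough to drop as a whole layer (coarse-grain). The victim selection, per-stream popularity, and step size are simplifying assumptions, not the paper's exact policy.

    def flush(cache, popularity, bytes_needed, segments_per_step=10):
        """cache[stream][layer] is a list of cached segments (byte strings)."""
        freed = 0
        while freed < bytes_needed and cache:
            victim = min(cache, key=lambda s: popularity.get(s, 0.0))
            top_layer = max(cache[victim])
            segments = cache[victim][top_layer]
            if len(segments) <= segments_per_step:
                freed += sum(len(seg) for seg in segments)
                del cache[victim][top_layer]           # coarse-grain: drop the whole layer
                if not cache[victim]:
                    del cache[victim]
            else:
                for _ in range(segments_per_step):     # fine-grain: trim segments from the end
                    freed += len(segments.pop())
        return freed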

  18. Conclusion • End-to-end architecture for delivery of quality-adaptive multimedia streams • Congestion control & Quality adaptation • Proxy caching mechanism for multimedia streams • Pre-fetching • Replacement algorithm • State of the cache converges to the optimal state

  19. Future Directions • Extensive simulation (using VINT/ns) • e.g., access patterns, bandwidth distribution • Exploring other replacement patterns • Chunk-based popularity function

  20. Alternative Replacement Algorithm • Goal: to cache the popular portion of each stream • Keep track of per-chunk popularity • Identify a victim chunk • Apply the same replacement pattern within the victim chunk
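
A brief sketch of the chunk-based variant, assuming fixed-length chunks and a simple hit counter (chunk length and bookkeeping are illustrative): popularity is tracked per chunk so that only the popular portion of each stream stays cached, and the per-layer/per-segment replacement pattern is then applied inside the victim chunk.

    CHUNK_SECONDS = 30.0      # illustrative chunk length

    chunk_popularity = {}     # (stream_id, chunk_index) -> hit count

    def record_chunk_hits(stream_id, play_start, play_end):
        """Count a hit for every chunk touched by this playback (including VCR jumps)."""
        first = int(play_start // CHUNK_SECONDS)
        last = int(play_end // CHUNK_SECONDS)
        for chunk in range(first, last + 1):
            key = (stream_id, chunk)
            chunk_popularity[key] = chunk_popularity.get(key, 0) + 1

    def pick_victim_chunk(cached_chunks):
        """Least popular cached chunk; flushing then proceeds layer by layer within it."""
        return min(cached_chunks, key=lambda c: chunk_popularity.get(c, 0))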
