  1. 89-850 Communication Networks: Average-Case Analysis (Little’s Law, Stochastic Analysis and Queuing Theory) Last updated: Tuesday, October 29, 2019 Prof. Amir Herzberg Dept of Computer Science, Bar Ilan University http://AmirHerzberg.com

  2. Overview: Average Case vs. Worst Case • Network calculus: worst case analysis • Worst-case bounds at every hop • Allows guaranteed QoS • But: can be wasteful • Our goal today: average case analysis • Little’s law (simplified): N=λT, where: • T is average time of packet in system • λ is average arrival rate • N is average # of packets in system • Stochastic analysis, esp. Markov chains • Queuing theory

  3. References for this subject • Data Networks, Bertsekas & Gallager, 2nd ed., 1992, chapter 3 • Communication Networking – An Analytical Approach, Kumar, Manjunath and Kuri, Elsevier, 2004: Chapter 5 and Appendix D • A bit also in: High-Speed Networks and Internets – Performance and QoS, 2nd ed., Stallings, 2002, chapters 7-9 • All: in library (folder / books)

  4. Basic Queueing (Service) Model [Figure: arrivals → buffer (queued customers) → server(s) (in service) → departures] • A queue models any service station with: • One or multiple servers • A waiting area or buffer • Customers arrive to receive service • A customer that upon arrival does not find a free server waits in the buffer • Initially: no loss

  5. Queues: General Observations [Figure: queue with arrival rate λ, service rate μ, buffer size b] • Increase in λ leads to more packets in queue (on average) and longer delays • Decrease in μ leads to longer processing delays and more packets in queue • Decrease in b: • packet drops more likely • less delay for the “average” packet accepted into the queue

  6. Little’s Theorem: Average Capacity Analysis of Service/Queuing Network • Analysis of a network of queues (service elements) with no loss • λ: average customer (packet) arrival rate • N: average number of customers (packets) in system • T: average delay per customer in system (denoted W in [KMK]) • Little’s Theorem (simplified): N = λT • More arrivals (larger λ) ⇒ more delay, more customers in system • More delays ⇒ more customers in system • More customers ⇒ more delays [Figure: system with arrival rate λ, occupancy N, delay T]

  7. Little’s Theorem: General Formulation • λ: customer arrival rate • N: average number of customers in system • T: average delay per customer in system • Little’s Theorem (simplified): N = λT • System in steady-state [Figure: system with arrival rate λ, occupancy N, delay T]

  8. Little’s Law for Fixed-Delay System • Simple, deterministic link, no queue: • Message arrives once every 1/λ seconds • Processing and transmission take 1/μ seconds • Assume λ < μ, i.e., (1/λ) > (1/μ) [why?] • Propagation: TProp seconds; assume λ·TProp is an integer • Nprop(t): number of propagating messages • Little’s Law (no queue, deterministic): Nprop(t) = λ·TProp • For t > TProp + 1/μ [Figure: λ, μ, TProp and the window of Nprop messages in flight]

  9. Little’s Law: Average Link Utilization • Analysis of a simple link with no queue: • Messages arrive exactly every 1/λ seconds • Processing and transmission take 1/μ seconds • Assume λ < μ, i.e., (1/λ) > (1/μ) • Ntran(t): number of messages in transmission at time t • Ntran(t) ∈ {0,1} • Ntran(t)=1 for 1/μ seconds, then 0 for x = (1/λ) − (1/μ) seconds • ⇒ avg(Ntran(t)) = (1·(1/μ) + 0·x)/(1/λ) = λ/μ < 1 • Link utilization = average # of packets in transmission • Little’s Law (average link utilization): ρ = λ·(1/μ) = λ/μ (see the sketch below)
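As a quick sanity check of the computation above, here is a minimal Python sketch; the rates λ=50 and μ=80 msgs/s are assumed for illustration and are not from the slides.

```python
# Minimal sketch (assumed rates) checking avg(Ntran) = lam/mu for the no-queue link.
lam, mu = 50.0, 80.0            # arrival and service rates [msgs/s], lam < mu
busy = 1.0 / mu                 # Ntran(t) = 1 for 1/mu seconds per message...
idle = 1.0 / lam - 1.0 / mu     # ...then 0 for x = (1/lam) - (1/mu) seconds
rho = (1 * busy + 0 * idle) / (1.0 / lam)
print(rho, lam / mu)            # both print 0.625
```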

  10. Counting Processes of a Queue [Figure: A(t), N(t), D(t) plotted vs. t; T3 marked as the time between the 3rd arrival and the 3rd departure] • N(t): number of customers in system at time t • A(t): number of customer arrivals till time t • D(t): number of customer departures till time t • Ti: time spent in system by the ith customer

  11. Analysis of Run (for Little’s Theorem) [1] [Figure: A(t), N(t), D(t) vs. t, with T3 marked; figure simplified: FIFO] • N(t): number of customers at time t • A(t): number of arrivals till time t • D(t): number of departures till time t • Ti: time spent by the ith customer • Since N(s) = A(s) − D(s), the area between the A and D curves up to time t is ∫0t N(s)ds, hence: Σi=1..D(t) Ti ≤ ∫0t N(s)ds ≤ Σi=1..A(t) Ti

  12. Analysis (for Little’s Theorem) [2] • Divide by time t for time-average values: (1/t)Σi=1..D(t) Ti ≤ (1/t)∫0t N(s)ds ≤ (1/t)Σi=1..A(t) Ti, i.e., (D(t)/t)·(1/D(t))Σi=1..D(t) Ti ≤ (1/t)∫0t N(s)ds ≤ (A(t)/t)·(1/A(t))Σi=1..A(t) Ti • N(t): number of customers at time t • A(t): number of arrivals till time t • D(t): number of departures till time t • Ti: time spent by the ith customer

  13. Assumptions: Limits T, λ Exist and λ>0 • We have: (D(t)/t)·(1/D(t))Σi=1..D(t) Ti ≤ (1/t)∫0t N(s)ds ≤ (A(t)/t)·(1/A(t))Σi=1..A(t) Ti • Assume the following limits exist: λ = limt→∞ A(t)/t and T = limt→∞ (1/A(t))Σi=1..A(t) Ti • Also assume: λ > 0 • N(t): number of customers at time t • A(t): number of arrivals till time t • D(t): number of departures till time t • Ti: time spent by the ith customer

  14. Departure Rate Also Converges to λ • We assumed: λ = limt→∞ A(t)/t, T = limt→∞ (1/A(t))Σi=1..A(t) Ti, λ > 0 • It follows that all customers (eventually) depart, and: limt→∞ D(t)/t = λ • Hence also: limt→∞ (1/D(t))Σi=1..D(t) Ti = T • N(t): number of customers at time t • A(t): number of arrivals till time t • D(t): number of departures till time t • Ti: time spent by the ith customer

  15. Using Limits T, λ and λ>0 • Modify: (1/t)∫0t N(s)ds ≤ (1/t)Σi=1..A(t) Ti • To: (1/t)∫0t N(s)ds ≤ (A(t)/t)·(1/A(t))Σi=1..A(t) Ti • Focusing on the upper bound, take the limit t→∞: limsupt→∞ (1/t)∫0t N(s)ds ≤ λT • Since A(t)/t → λ and (1/A(t))Σi=1..A(t) Ti → T

  16. Lower (and Tight) Bound • We had: limsupt→∞ (1/t)∫0t N(s)ds ≤ λT • Add the lower bound, again taking the limit: λT ≤ liminft→∞ (1/t)∫0t N(s)ds • ⇒ N = limt→∞ (1/t)∫0t N(s)ds exists and N = λT

  17. Little’s Theorem • We proved part 1 of… • Little’s Theorem: • If in a run the limits λ = limt→∞ A(t)/t > 0 and T = limt→∞ (1/A(t))Σi=1..A(t) Ti exist, then in this run the limit N = limt→∞ (1/t)∫0t N(s)ds exists, and holds: N = λT • If the limits λ, T exist and are the same for all runs [with probability 1], then N = λT is the average occupancy (for all runs with prob. 1)

  18. Little’s Law: Example • People arrive at a bank at an avg. rate of 5/min. They spend an average of 20 min in the bank. What is the average # of people in the bank at any time? • To keep the average # of people under 50, how much time should be spent by customers on average in the bank? • λ=5, T=20: E[N] = λ·E[T] = 5·20 = 100 • λ=5, E[N] < 50: E[T] = E[N]/λ < 50/5 = 10 (see the sketch below)
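A minimal simulation sketch (not from the slides) of the bank example: Poisson arrivals at λ=5 customers/min, each staying an exponentially distributed time with mean 20 min; the time-average occupancy should land near λT = 100. The exponential stay time is an assumption for illustration; Little's law does not depend on it.

```python
import random

random.seed(1)
lam, mean_stay, horizon = 5.0, 20.0, 50_000.0   # rate [1/min], mean stay [min], sim length [min]

events = []                     # (time, +1 for an arrival / -1 for a departure)
t = 0.0
while t < horizon:
    t += random.expovariate(lam)                                  # exponential inter-arrival gap
    events.append((t, +1))
    events.append((t + random.expovariate(1.0 / mean_stay), -1))  # this customer's departure

events.sort()
n, last, area = 0, 0.0, 0.0
for time, delta in events:
    if time > horizon:
        break
    area += n * (time - last)   # integrate N(t) between consecutive events
    n, last = n + delta, time

print("time-average N =", area / last, " lam*T =", lam * mean_stay)
```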

  19. Example: Constant-Rate Switch [Figure: switch with arrivals A(t), departures D(t), output rate C] • Output rate C, arrival (and departure) rate λ • Switch is work-conserving (non-idling) • Let Nl(t)=1 if the link is serving, 0 else • Let Lk be the length of the k-th packet, L: average length • Average transmit time is L/C • From Little’s theorem: avg(Nl(t)) = λ·(L/C) • Utilization = arrival bit rate / output rate = λL/C

  20. Overview • Little’s Theorem • Markov Chains: Memory-less Stochastic Process • The Poisson Process and Exponential Distribution • A/B/C/D (Kendall) Notation and the M/M/1 System

  21. Markov Chain: Memoryless Process [Figure: chain jumping among states 0, 1, 2, …] • Stochastic process that takes values in a countable set (finite or infinite) • Example: X(t) ∈ {0,1,2,…,m}, or {0,1,2,…} • Elements represent possible “states” • Chain “jumps” from state to state • Memoryless (Markov) Property: Given the present state i, the probability to “jump” to state j is fixed (independent of history), denoted Pij • Pij are the transition probabilities, and Σj Pij = 1 • Discrete- or continuous-time

  22. Continuous-Time Markov Chain • Stochastic (timed) process {X(t) | t ≥ 0} • Values are integers (representing states) • Memoryless: • Time in state i is exponential with parameter vi; independent of history • Probability to change from state i to state j is fixed (independent of history), denoted Pij, and Σj Pij = 1 • Focus: Discrete-Time Markov Chain (DTMC) • For simplicity

  23. Discrete-Time Markov Chain (DTMC) [Figure: chain over states 0, 1, 2, …] • Discrete-time stochastic process: {Xn: n = 0,1,2,…} • Discrete states: Xn ∈ {0,1,2,…} • DTMC: memoryless transition probability: Pr{Xn+1=j | Xn=i, Xn-1=in-1, …, X0=i0} = Pr{Xn+1=j | Xn=i} • DTMC notation: Pij = Pr{Xn+1=j | Xn=i} • Time-homogeneous DTMC: Pij does not depend on n • Transition probability matrix P=[Pij], s.t.: Pij ≥ 0 and Σj Pij = 1
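A minimal sketch (the 3-state transition matrix is an assumed toy example) of a time-homogeneous DTMC: a trajectory X0, X1, … is sampled using only the current state and the row P[i], which is exactly the memoryless property.

```python
import random

P = [[0.9, 0.1, 0.0],   # P[i][j] = Pr{X_{n+1}=j | X_n=i}; every row sums to 1
     [0.2, 0.7, 0.1],
     [0.0, 0.3, 0.7]]

def step(i):
    """One memoryless transition out of state i, drawn according to row P[i]."""
    return random.choices(range(len(P)), weights=P[i])[0]

random.seed(0)
x, path = 0, [0]
for _ in range(20):
    x = step(x)
    path.append(x)
print(path)
```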

  24. Example: Voice/Video Encoding • Time interval T between packets • Send Bk ∈ {0,1} packets of length L at time kT • Example (ITU G.729 voice encoder): • 10B frame every 10 msec (8 Kbps) • T=20ms, header 40B, 2 frames ⇒ L=60B • Common model: • Lengths of talk periods (Uj) and silence periods (Vj) geometrically distributed, namely: Pr{Uj=kT} = τ(1-τ)k-1 and Pr{Vj=kT} = σ(1-σ)k-1, for k=1,2,…

  25. Example: Voice/Video Encoding (cont.) • Time interval T between packets = 20 msec • Geometrically-distributed talk (Uj) and silence (Vj) periods • Assume: • Mean talk period = 400ms = T·(1/τ) ⇒ τ=1/20 • Mean silence period = 600ms = T·(1/σ) ⇒ σ=1/30 • ⇒ Source active 40% of the time

  26. Exercise (5.1): Show Bk is a DTMC • Talk periods are geometrically distributed • Number of Bernoulli trials till change • Parameter τ: probability to change • Hence: Pr{Bk+1=0 | Bk=1} = τ, Pr{Bk+1=1 | Bk=1} = 1-τ • Similarly for silence periods: Pr{Bk+1=1 | Bk=0} = σ, Pr{Bk+1=0 | Bk=0} = 1-σ • How to find the mean byte rate? (see the sketch below) [Figure: two-state chain with P0,1=σ and P1,0=τ]
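One possible answer, as a minimal sketch: with the values assumed on the previous slides (T=20 ms, L=60 B, mean talk 400 ms, mean silence 600 ms), the source is active 40% of the time, so the mean byte rate is 0.4·L/T.

```python
T, L = 0.020, 60                                  # packet interval [s], packet length [bytes]
mean_talk, mean_silence = 0.400, 0.600            # [s], from the previous slide
active = mean_talk / (mean_talk + mean_silence)   # fraction of time in the talk state = 0.4
print("mean byte rate [B/s]:", active * L / T)    # 0.4 * 60 / 0.02 = 1200 B/s
```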

  27. Chapman-Kolmogorov Equations • n-step transition probability: Pij(n) = Pr{Xm+n=j | Xm=i} • Pij(n) is element (i, j) in matrix P(n) • Namely: P(n) = [Pij(n)] • Chapman-Kolmogorov equations: Pij(n+m) = Σk Pik(n)·Pkj(m) • Theorem D.1: P(n) = Pn
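A minimal numerical sketch of Theorem D.1, using the two-state talk/silence chain from Exercise 5.1 (σ=1/30, τ=1/20): the n-step transition probabilities are the entries of the n-th matrix power.

```python
import numpy as np

sigma, tau = 1/30, 1/20
P = np.array([[1 - sigma, sigma],
              [tau, 1 - tau]])

P5 = np.linalg.matrix_power(P, 5)   # 5-step transition probabilities P(5) = P^5
print(P5)                           # P5[i, j] = Pr{X_{m+5}=j | X_m=i}
print(P5.sum(axis=1))               # each row still sums to 1
```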

  28. Irreducible Markov Chains [Figure: chain over states 0–4] • Transition probabilities • Pij = P{Xn+1=j | Xn=i} • Σj Pij = 1 • State j is reachable from state i, if Pij(n) > 0 for some n; denote: i→j • States i, j communicate (i↔j) if for some n, n’ hold: Pij(n) > 0 and Pji(n’) > 0, i.e. i→j, j→i • Chain is irreducible if all states communicate • We focus on irreducible DTMCs
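A minimal sketch (with an assumed 3-state example) of checking irreducibility of a finite DTMC: all states communicate iff every entry of (I+P)^(n-1) is positive, where n is the number of states.

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],    # assumed example: 0 -> 1 -> 2 -> 0, so all states communicate
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])
n = P.shape[0]
reach = np.linalg.matrix_power(np.eye(n) + P, n - 1) > 0   # (i, j) true iff j reachable from i
print("irreducible:", bool(reach.all()))                    # True
```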

  29. Recurrent State • Given DTMC with: • (countable) set of states S • Transition probability matrix P • For every i, j ∈ S and n>0, let: Pij(n) = Pr{Xm+n=j | Xm=i} • Probability of moving from i to j in n steps • And let: fjj(n) = Pr{first return to j occurs at step n | X0=j} • State j is recurrent if: Σn≥1 fjj(n) = 1 • Visited ⇒ visited infinitely often

  30. Exercise (5.1): Continue [Figure: two-state chain with P0,1=σ, P1,0=τ] • Two communicating states • ⇒ Irreducible DTMC • State 0 is recurrent since: f00(1) = 1-σ and f00(n) = σ(1-τ)n-2τ for n≥2, so Σn≥1 f00(n) = (1-σ) + σ = 1

  31. Positive Recurrent State • State j is recurrent if: Σn≥1 fjj(n) = 1 • Visited ⇒ visited infinitely often • Recurrent state j is positive if: Σn≥1 n·fjj(n) < ∞ • Mean time between visits is finite

  32. Exercise (5.1): Continue [Figure: two-state chain with P0,1=σ, P1,0=τ] • Two communicating states • ⇒ Irreducible DTMC • State 0 is positive recurrent since: Σn≥1 n·f00(n) = (1-σ) + σ(1 + 1/τ) = 1 + σ/τ < ∞ • Question: what about state 1? Recurrent? Positive?

  33. Positive Recurrent DTMC • Theorem D.2: For an irreducible DTMC, either all states, or no states, are positive (recurrent) • An irreducible DTMC is positive recurrent if all its states are • π is an invariant distribution on S, if π = πP • Theorem D.4: An irreducible DTMC is positive recurrent, if and only if it has a (unique) positive invariant distribution π • Such a DTMC is stationary if Pr{X0=j} = πj for all j

  34. Properties of Invariant Distribution • π = πP • Namely: πj = Σi πi Pij • The long-run fraction of steps spent in state j is πj • Namely: limn→∞ (1/n)Σk=1..n Pr{Xk=j} = πj

  35. Exercise (5.1): Continue [Figure: two-state chain with P0,1=σ, P1,0=τ] • Invariant equations: π0 = (1-σ)π0 + τπ1, π1 = σπ0 + (1-τ)π1, π0 + π1 = 1 • ⇒ π0 = τ/(σ+τ) = 0.6, π1 = σ/(σ+τ) = 0.4 (solved numerically in the sketch below)
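A minimal sketch solving π = πP numerically for this two-state chain (σ=1/30, τ=1/20 as assumed in the exercise), by stacking the balance equations with the normalization Σπ = 1.

```python
import numpy as np

sigma, tau = 1/30, 1/20
P = np.array([[1 - sigma, sigma],
              [tau, 1 - tau]])

# Solve pi(P - I) = 0 together with the normalization pi_0 + pi_1 = 1.
A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)    # [tau/(sigma+tau), sigma/(sigma+tau)] = [0.6, 0.4]
```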

  36. Another Example: Finite Markov Chain • An absent-minded professor uses two umbrellas when commuting between home and office. If it rains and an umbrella is available at her location, she takes it. If it does not rain, she always forgets to take an umbrella. Let p be the probability of rain each time she commutes. What is the probability that she gets wet on any given day? • Markov chain formulation: i is the number of umbrellas available at her current location [Figure: three-state chain over states 0, 1, 2] • Transition matrix: P0,2 = 1; P1,1 = 1-p, P1,2 = p; P2,1 = p, P2,0 = 1-p

  37. Professor Example: Solution [Figure: three-state chain over states 0, 1, 2] • Balance equations: π0 = (1-p)π2, π1 = (1-p)π1 + pπ2, π2 = π0 + pπ1, with π0 + π1 + π2 = 1 • ⇒ π1 = π2 = 1/(3-p), π0 = (1-p)/(3-p) • She gets wet when no umbrella is at her location and it rains: Pr{wet} = p·π0 = p(1-p)/(3-p) (see the sketch below)
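A minimal sketch of the umbrella chain (p=0.3 is an assumed value for illustration): build the transition matrix, solve for the invariant distribution, and evaluate the probability of getting wet.

```python
import numpy as np

p = 0.3                               # assumed probability of rain per commute
# States: number of umbrellas at the professor's current location.
P = np.array([[0.0,   0.0,  1.0],     # 0 here -> the other location has both
              [0.0, 1 - p,    p],     # 1 here: taken only if it rains
              [1 - p,   p,  0.0]])    # 2 here: one taken if it rains, none otherwise

A = np.vstack([P.T - np.eye(3), np.ones((1, 3))])
pi, *_ = np.linalg.lstsq(A, np.array([0, 0, 0, 1.0]), rcond=None)
print("invariant distribution:", pi)           # [(1-p)/(3-p), 1/(3-p), 1/(3-p)]
print("P(gets wet) = p * pi[0] =", p * pi[0])  # p(1-p)/(3-p) ~ 0.0778
```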

  38. Properties of Invariant Distribution • The long-run fraction of steps spent in state j is πj • Namely: limn→∞ (1/n)Σk=1..n Pr{Xk=j} = πj • Gives utility, average rate!! • What of Pr{Xn=j} itself (i.e., of Pij(n))? • Does it converge? To πj? • Not always • Give a counter-example (see the sketch below) • Convergence holds if the DTMC is aperiodic
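A minimal sketch of one such counter-example: a two-state chain that deterministically alternates has period 2, so P^n keeps oscillating and Pr{Xn=j} never converges, even though the invariant distribution π = [0.5, 0.5] exists and the time-average fraction of visits still converges to it.

```python
import numpy as np

P = np.array([[0.0, 1.0],     # periodic chain: always jump to the other state
              [1.0, 0.0]])
for n in (1, 2, 3, 4):
    print(n, np.linalg.matrix_power(P, n).tolist())   # alternates between P and the identity
```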

  39. Aperiodic Markov Chains [Figure: chain over states 0, 1, 2] • The period dj of state j is: dj = gcd{n ≥ 1 : Pjj(n) > 0} • State j is aperiodic if dj=1 • A DTMC is aperiodic if all its states are aperiodic • Theorem D.5: an irreducible DTMC is aperiodic if any of its states is aperiodic • Theorem D.6: For an irreducible positive DTMC, • For all states j holds: limn→∞ (1/n)Σk=1..n Pjj(k) = πj • If the DTMC is aperiodic, then for all states i, j holds: limn→∞ Pij(n) = πj
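A minimal sketch of Theorem D.6 for the (aperiodic) two-state talk/silence chain: the rows of P^n converge to the invariant distribution π = [τ/(σ+τ), σ/(σ+τ)] = [0.6, 0.4].

```python
import numpy as np

sigma, tau = 1/30, 1/20
P = np.array([[1 - sigma, sigma],
              [tau, 1 - tau]])
print(np.linalg.matrix_power(P, 200))   # both rows are (numerically) close to [0.6, 0.4]
```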

  40. Exercise (5.1): Continue [Figure: two-state chain with P0,1=σ, P1,0=τ] • The period dj of state j is: dj = gcd{n ≥ 1 : Pjj(n) > 0} • State j is aperiodic if dj=1 • So is this aperiodic? (and why?) • Sure!! (P00 = 1-σ > 0, so d0 = 1) • So utility = fraction of time at state 1 = π1 = σ/(σ+τ) = 0.4 • Average rate at state 1 is L/T, at state 0 is 0 • ⇒ average rate is π1·L/T = 0.4·(60B/20ms) = 1200 B/s

  41. Global Balance Equations • Aperiodic, irreducible DTMC • With unique stationary distribution πj = Σi πi Pij • Recall that Σj Pij = 1, hence also Σi Pji = 1 • Hence: πj·1 = πj Σi Pji = Σi πi Pij • These are the Global Balance Equations • Freq. of transitions out of (every) state j = Freq. of transitions into state j • Since the state must be j infinitely often (irreducible chain): -1 ≤ |{transitions out of j}| − |{transitions into j}| ≤ 1

  42. Global Balance Equations for a Set • Consider an aperiodic, irreducible Markov chain • With unique stationary distribution πj = Σi πi Pij • Global Balance Equations: πj Σi Pji = Σi πi Pij • Sum over a set of states S: Σj∈S Σi∉S πj Pji = Σj∈S Σi∉S πi Pij • Freq. of transitions out of S = Freq. of transitions into S

  43. Birth-Death Process/System [Figure: chain over states 0, 1, 2, 3, … with transitions only between neighbors] • Markov chain typical of queuing systems • Two successive states can differ only by one: Pi,j = 0 if |i-j| > 1 • A birth-death system is irreducible if and only if Pi,i+1 > 0 and Pi+1,i > 0 for all i • For irreducible, aperiodic birth-death systems the simplified balance equations hold: πi Pi,i+1 = πi+1 Pi+1,i
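A minimal sketch (with assumed birth probability 0.3 and death probability 0.5 on a truncated 4-state chain) checking the simplified balance equations: πi·Pi,i+1 equals πi+1·Pi+1,i for every i.

```python
import numpy as np

p, q, n = 0.3, 0.5, 4                # assumed birth/death probabilities, 4 states
P = np.zeros((n, n))
for i in range(n):
    if i + 1 < n: P[i, i + 1] = p    # birth
    if i - 1 >= 0: P[i, i - 1] = q   # death
    P[i, i] = 1 - P[i].sum()         # stay put with the remaining probability

A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
pi, *_ = np.linalg.lstsq(A, np.array([0.0] * n + [1.0]), rcond=None)
for i in range(n - 1):
    print(pi[i] * P[i, i + 1], pi[i + 1] * P[i + 1, i])   # the two sides agree
```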

  44. Exercise (5.1): Continue (done?) [Figure: two-state chain with P0,1=σ, P1,0=τ] • Balance equations: π0·σ = π1·τ • Simplified balance equations: πi Pi,i+1 = πi+1 Pi+1,i • With π0 + π1 = 1 ⇒ π0 = τ/(σ+τ), π1 = σ/(σ+τ)

  45. Wet Professor Example Revisited [Figure: three-state chain over states 0, 1, 2]

  46. Kendall’s Notation: A/S/N[/b] [Figure: arrivals (A, rate λ) into a buffer of size b feeding N servers (S, rate μ each)] • Notation for types of queues • A is the arrival process • M = Markov / Memoryless; Poisson • D = deterministic (constant time between arrivals) • G = general (anything else) • S is the service process • M [exponential], D, G: as above • N is the number of parallel processors • b (optional) is the size of the queue • Dropped when the size is infinite

  47. Queue Descriptors: Examples • M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer • M/M/m: Same, but with m servers • M/G/1: Poisson arrivals, identically distributed service times following a general distribution, one server, infinite buffer • */D/∞: A constant-delay system

  48. The M/M/1 Queue [Figure: birth-death chain over states 0, 1, 2, 3, … with transition probabilities P00, P0,1, P11, P22, P3,4, …] • a.k.a. M/M/1/∞ • Poisson arrivals • Exponential service time • 1 processor, infinite-length queue • An irreducible, aperiodic birth-death Markov chain • Simplify: discrete-time analysis (intervals of length δ) • Let Nk be the number of pkts in the system at time kδ (when > 1: one more than # pkts in queue) • Let Pij = Pr{Nk+1=j | Nk=i} be the transition probabilities [ignore dependency on δ] (a simulation sketch follows below)
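A minimal simulation sketch of an M/M/1 queue (λ=0.8, μ=1.0 are assumed values), compared against the well-known steady-state occupancy ρ/(1-ρ) = 4; that formula is derived from the birth-death balance equations later in the course, not on these slides.

```python
import random

random.seed(2)
lam, mu, horizon = 0.8, 1.0, 200_000.0
t, n, area, last = 0.0, 0, 0.0, 0.0
next_arrival = random.expovariate(lam)
next_departure = float("inf")          # empty system: no departure scheduled

while t < horizon:
    t = min(next_arrival, next_departure)
    area += n * (t - last)             # integrate N(t) between events
    last = t
    if t == next_arrival:
        n += 1
        next_arrival = t + random.expovariate(lam)
        if n == 1:                     # server was idle: start serving the new packet
            next_departure = t + random.expovariate(mu)
    else:
        n -= 1
        next_departure = t + random.expovariate(mu) if n > 0 else float("inf")

print("simulated avg N:", area / last, " theory rho/(1-rho):", (lam/mu) / (1 - lam/mu))
```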

  49. Poisson Arrival Process • Probability of n arrivals during (t, t+τ]: Pr{A(t+τ) − A(t) = n} = e-λτ(λτ)n / n! • Memoryless: the number of arrivals in an interval is independent of arrivals before it (see the sketch below)
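A minimal sketch evaluating the Poisson counting formula Pr{n arrivals in τ} = e^(-λτ)(λτ)^n/n! (λ=2 and τ=1.5 are assumed values) and checking that the probabilities sum to 1.

```python
import math

lam, tau = 2.0, 1.5                    # assumed rate and interval length
probs = [math.exp(-lam * tau) * (lam * tau) ** n / math.factorial(n) for n in range(50)]
print(probs[:4])                       # P{0 arrivals}, ..., P{3 arrivals}
print(sum(probs))                      # ~1.0: the distribution is normalized
```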

  50. Poisson Arrivals in a Small Interval • Interval (t, t+δ] of length δ [0 < δ < 1]: • Pr{0 arrivals} = e-λδ = 1 − λδ + o(δ) • Pr{1 arrival} = λδe-λδ = λδ + o(δ) • Pr{2 or more arrivals} = o(δ) • o(δ) satisfies: limδ→0 [o(δ)/δ] = 0 • Proof: from the Taylor series, e-λδ = 1 − λδ + (λδ)2/2 − …
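A minimal sketch of the small-interval behaviour (λ=2 assumed): as δ shrinks, Pr{1 arrival} approaches λδ and Pr{2 or more arrivals}/δ goes to 0, i.e., it is o(δ).

```python
import math

lam = 2.0                                                    # assumed arrival rate
for delta in (0.1, 0.01, 0.001):
    p1 = lam * delta * math.exp(-lam * delta)                # exactly one arrival
    p2plus = 1 - math.exp(-lam * delta) * (1 + lam * delta)  # two or more arrivals
    print(delta, p1, lam * delta, p2plus / delta)            # p2plus/delta -> 0
```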
