Flows and Networks (158052)


Presentation Transcript


  1. Introduction to the theory of flows in complex networks: both stochastic and deterministic aspects. Size: 5 ECTS. 32 hours of lectures: 16 by R.J. Boucherie, focusing on stochastic networks, and 16 by W. Kern, focusing on deterministic networks. Common problem: how to optimize resource allocation so as to maximize the flow of items through the nodes of a complex network. Material: handouts / downloads. Exam: exercises / (take-home) exam. References: see website. Flows and Networks (158052), Richard Boucherie, Stochastische Operations Research -- TW, wwwhome.math.utwente.nl/~boucherierj/onderwijs/158052/158052.html

  2. Motivation and main question. Examples: production / storage systems, the Internet (http://www.warriorsofthe.net/trailer), road traffic. Main question: how to allocate servers / capacity to nodes, or how to route jobs through the system, so as to maximize system performance measures such as throughput, sojourn time, and utilization. Questions?

  3. Consider an open Jackson network with given transition rates. Assume that the service rates and arrival rates are given. Let the cost per time unit for a job residing at queue j be given, and let the cost for routing a job from station i to station j be given. (i) Formulate the design problem (allocation of routing probabilities) as an optimisation problem. (ii) Provide the solution to this problem. Aim: optimal design of a Jackson network.
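The slide's formulas are not preserved in the transcript. As a hedged sketch only, assume (hypothetical notation, not the slide's) external arrival rates \(\gamma_j\), service rates \(\mu_j\), routing probabilities \(p_{ij}\) to be chosen, holding cost \(a_j\) per unit time for a job at queue j, and routing cost \(b_{ij}\) per job routed from i to j. With throughputs \(\lambda_j\) given by the traffic equations, one possible formulation of the design problem is
\[
\lambda_j = \gamma_j + \sum_i \lambda_i p_{ij}, \qquad
\min_{(p_{ij})}\ \sum_j a_j \frac{\rho_j}{1-\rho_j} \;+\; \sum_{i,j} b_{ij}\,\lambda_i\,p_{ij},
\qquad \rho_j = \frac{\lambda_j}{\mu_j} < 1,\quad p_{ij}\ge 0,\quad \sum_j p_{ij}\le 1,
\]
where \(\rho_j/(1-\rho_j)\) is the equilibrium mean number of jobs at queue j in an open Jackson network.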

  4. Contents • Introduction; Markov chains • Birth-death processes; Poisson process, simple queue; reversibility; detailed balance • Output of simple queue; tandem network; equilibrium distribution • Jackson networks; partial balance • Sojourn time of the simple queue and the tandem network • Performance measures for Jackson networks: throughput, mean sojourn time, blocking • Application: service rate allocation for throughput optimisation • Application: optimal routing Flows and networks: stochastic networks

  5. Today: • Introduction / motivation course • Discrete-time Markov chain • Continuous time Markov chain • Birth-death process • Example: pure birth process • Example: pure death process • Simple queue • General birth-death process: equilibrium • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Summary / Next • Exercises

  6. Today: • Introduction / motivation course • Discrete-time Markov chain • Continuous time Markov chain • Birth-death process • Example: pure birth process • Example: pure death process • Simple queue • General birth-death process: equilibrium • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Summary / Next • Exercises

  7. Markov chain • Xn, n = 1,2,…: stochastic process • State space: all possible states • Transition probability • Markov property • Time-homogeneous • Property
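The defining formulas are omitted from the transcript; in the usual notation they read
\[
p(i,j) = P(X_{n+1}=j \mid X_n=i), \qquad
P(X_{n+1}=j \mid X_n=i, X_{n-1}=i_{n-1},\dots,X_0=i_0) = P(X_{n+1}=j \mid X_n=i),
\]
with time-homogeneity meaning that \(p(i,j)\) does not depend on n, and the property \(\sum_{j} p(i,j) = 1\) for every state i.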

  8. Markov chain : equilibrium distribution • n-step transition probability • Evaluate: • Chapman-Kolmogorov equation • n-step transition matrix • Initial distribution • Distribution at time n • Matrix form
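In standard notation the omitted formulas are
\[
p^{(n)}(i,j) = P(X_n=j \mid X_0=i), \qquad
p^{(m+n)}(i,j) = \sum_{k} p^{(m)}(i,k)\,p^{(n)}(k,j) \quad \text{(Chapman-Kolmogorov)},
\]
and, with n-step transition matrix \(P^n\) and initial distribution \(\pi_0\), the distribution at time n is \(\pi_n = \pi_0 P^n\) in matrix form.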

  9. Markov chain: classification of states • j reachable from i if there exists a path from i to j • i and j communicate when j is reachable from i and i is reachable from j • State i absorbing if p(i,i)=1 • State i transient if there exists j such that j is reachable from i and i is not reachable from j • State i recurrent if the process returns to i infinitely often (= non-transient state) • State i periodic with period k>1 if k is the smallest number such that all paths from i to i have a length that is a multiple of k • Aperiodic state: recurrent state that is not periodic • Ergodic Markov chain: all states communicate, are recurrent and aperiodic (irreducible, aperiodic)

  10. Markov chain: equilibrium distribution • Assume: Markov chain ergodic • Equilibrium distribution: independent of initial state; stationary distribution • Normalising • Interpretation: probability flux
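A reconstruction of the omitted equations in standard notation:
\[
\pi(j) = \lim_{n\to\infty} P(X_n=j), \qquad
\pi(j) = \sum_{i} \pi(i)\,p(i,j), \qquad \sum_{j} \pi(j) = 1;
\]
equivalently, in equilibrium the probability flux into state j equals the probability flux out of state j.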

  11. Discrete time Markov chain: summary • stochastic process X(t) • countable or finite state space S • Markov property • time-homogeneous: independent of t • irreducible: each state in S reachable from any other state in S • transition probabilities • Assume ergodic (irreducible, aperiodic) • global balance equations (equilibrium eqns) • solution that can be normalised is equilibrium distribution • if equilibrium distribution exists, then it is unique and is limiting distribution

  12. Today: • Introduction / motivation course • Discrete-time Markov chain • Continuous time Markov chain • Birth-death process • Example: pure birth process • Example: pure death process • Simple queue • General birth-death process: equilibrium • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Summary / Next • Exercises

  13. Continuous time Markov chain • stochastic process X(t) • countable or finite state space S • Markov property • transition probability • irreducible: each state in S reachable from any other state in S • Chapman-Kolmogorov equation • transition rates or jump rates
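In the usual (time-homogeneous) notation the omitted formulas read
\[
P_{jk}(t) = P(X(s+t)=k \mid X(s)=j), \qquad
P_{jk}(s+t) = \sum_{m} P_{jm}(s)\,P_{mk}(t) \quad \text{(Chapman-Kolmogorov)},
\]
\[
q(j,k) = \lim_{h\downarrow 0}\frac{P_{jk}(h)}{h}\ \ (k\ne j), \qquad q(j) = \sum_{k\ne j} q(j,k).
\]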

  14. Continuous time Markov chain • Chapman-Kolmogorov equation • transition rates or jump rates • Kolmogorov forward equations (REGULAR) • Global balance equations
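A standard-form reconstruction of the forward and global balance equations:
\[
\frac{d}{dt}P_{jk}(t) = \sum_{m\ne k} P_{jm}(t)\,q(m,k) - P_{jk}(t)\,q(k), \qquad
\sum_{k\ne j} \pi(k)\,q(k,j) = \pi(j)\sum_{k\ne j} q(j,k),
\]
the latter obtained from the former in the limit t → ∞ for a regular, ergodic chain.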

  15. Markov jump chain • Here a transparency with the jump chain; this is needed in a proof later on.
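The transparency itself is not in the transcript; in standard terms the Markov jump chain is the chain embedded at the jump times, a sketch of which is
\[
Y_n = X(T_n), \qquad P(Y_{n+1}=k \mid Y_n=j) = \frac{q(j,k)}{q(j)}, \quad k \ne j,
\]
where \(T_n\) is the n-th jump epoch and the holding time in state j is exponential with parameter \(q(j)\).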

  16. Continuous time Markov chain: summary • stochastic process X(t) • countable or finite state space S • Markov property • transition rates independent of t • irreducible: each state in S reachable from any other state in S • Assume ergodic and regular • global balance equations (equilibrium eqns) • π is stationary distribution • solution that can be normalised is equilibrium distribution • if equilibrium distribution exists, then it is unique and is limiting distribution

  17. Today: • Introduction / motivation course • Discrete-time Markov chain • Continuous time Markov chain • Birth-death process • Example: pure birth process • Example: pure death process • Simple queue • General birth-death process: equilibrium • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Summary / Next • Exercises

  18. Birth-death process • State space • Markov chain, transition rates • Bounded state space: q(J,J+1)=0, then the state space is bounded above at J; q(I,I-1)=0, then the state space is bounded below at I • Kolmogorov forward equations • Global balance equations
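In standard notation (unbounded case) the omitted rates and balance equations are
\[
q(j,j+1)=\lambda_j, \qquad q(j,j-1)=\mu_j, \qquad q(j,k)=0 \text{ otherwise},
\]
\[
\pi(j)(\lambda_j+\mu_j) = \pi(j-1)\lambda_{j-1} + \pi(j+1)\mu_{j+1}\ \ (j\ge 1), \qquad
\pi(0)\lambda_0 = \pi(1)\mu_1 .
\]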

  19. Example: pure birth process • Exponential interarrival times, mean 1/ • Arrival process is Poisson process • Markov chain? • Transition rates : let t0<t1<…<tn<t • Kolmogorov forward equations for P(X(0)=0)=1 • Solution for P(X(0)=0)=1
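The arrival-rate symbol is dropped in the transcript ("mean 1/"); writing \(\lambda\) for it, the forward equations and their solution for P(X(0)=0)=1 are
\[
\frac{d}{dt}P(X(t)=0) = -\lambda P(X(t)=0), \qquad
\frac{d}{dt}P(X(t)=j) = \lambda P(X(t)=j-1) - \lambda P(X(t)=j), \quad j\ge 1,
\]
\[
P(X(t)=j) = \frac{(\lambda t)^j}{j!}\,e^{-\lambda t}, \quad j=0,1,2,\dots,
\]
i.e. X(t) is Poisson distributed with mean \(\lambda t\).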

  20. Example: pure death process • Exponential holding times, mean 1/ • P(X(0)=N)=1, S={0,1,…,N} • Markov chain? • Transition rates : let t0<t1<…<tn<t • Kolmogorov forward equations for P(X(0)=N)=1 • Solution for P(X(0)=N)=1
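Assuming a constant death rate \(\mu\) (each holding time exponential with mean \(1/\mu\); the symbol is again dropped in the transcript), the forward equations and solution for P(X(0)=N)=1 are
\[
\frac{d}{dt}P(X(t)=N) = -\mu P(X(t)=N), \qquad
\frac{d}{dt}P(X(t)=j) = \mu P(X(t)=j+1) - \mu P(X(t)=j), \quad 1\le j\le N-1,
\]
\[
\frac{d}{dt}P(X(t)=0) = \mu P(X(t)=1), \qquad
P(X(t)=j) = \frac{(\mu t)^{N-j}}{(N-j)!}\,e^{-\mu t}, \quad j=1,\dots,N,
\]
with \(P(X(t)=0) = 1-\sum_{j=1}^{N}P(X(t)=j)\).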

  21. Simple queue • Poisson arrival process with given rate, single server, exponential service times with given mean • Assume initially empty: P(X(0)=0)=1, S={0,1,2,…} • Markov chain? • Transition rates:
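With arrival rate \(\lambda\) and mean service time \(1/\mu\) (symbols dropped in the transcript), the transition rates of the simple queue are
\[
q(j,j+1)=\lambda \ \ (j\ge 0), \qquad q(j,j-1)=\mu \ \ (j\ge 1), \qquad q(j,k)=0 \text{ otherwise}.
\]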

  22. Simple queue • Poisson arrival process with given rate, single server, exponential service times with given mean • Kolmogorov forward equations, j>0 • Global balance equations, j>0

  23. Simple queue (ctd) • Transition diagram: neighbouring states j and j+1 with the arrival and service rates on the arcs • Equilibrium distribution exists when the arrival rate is smaller than the service rate: the stationary measure is then summable and yields the equilibrium distribution • Proof: insert into the global balance equations • Detailed balance
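With \(\rho=\lambda/\mu<1\), the standard form of the results indicated on this slide is
\[
\pi(j) = (1-\rho)\,\rho^{j}, \quad j=0,1,2,\dots, \qquad
\text{detailed balance: } \lambda\,\pi(j) = \mu\,\pi(j+1);
\]
the geometric measure \(\rho^j\) is summable exactly when \(\lambda<\mu\), and inserting \(\pi\) into the global balance equations (or checking detailed balance) verifies that it is the equilibrium distribution.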

  24. Birth-death process • State space • Markov chain, transition rates • Definition: detailed balance equations • Theorem: a distribution that satisfies detailed balance is a stationary distribution • Theorem: assume that the stationary measure can be normalised; then it is the equilibrium distribution of the birth-death process X.
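In standard notation the detailed balance equations and the equilibrium distribution they determine are
\[
\pi(j)\,\lambda_j = \pi(j+1)\,\mu_{j+1}, \quad j\ge 0, \qquad
\pi(n) = \pi(0)\prod_{j=0}^{n-1}\frac{\lambda_j}{\mu_{j+1}},
\]
provided this measure can be normalised, i.e. \(\sum_{n\ge 0}\prod_{j=0}^{n-1}\lambda_j/\mu_{j+1} < \infty\).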

  25. Today: • Introduction / motivation course • Discrete-time Markov chain • Continuous time Markov chain • Birth-death process • Example: pure birth process • Example: pure death process • Simple queue • General birth-death process: equilibrium • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Summary / Next • Exercises

  26. Reversibility; stationarity • Stationary process: a stochastic process is stationary if, for all t1,…,tn and all shifts τ, the joint distribution is invariant under the time shift τ • Theorem: if the initial distribution is a stationary distribution, then the process is stationary • Reversible process: a stochastic process is reversible if, for all t1,…,tn and all τ, the joint distribution is invariant under time reversal about τ • NOTE: the labelling of states only suggests a one-dimensional state space; this is not required
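The omitted defining identities are, in the usual notation,
\[
\text{stationary: } (X(t_1),\dots,X(t_n)) \;\overset{d}{=}\; (X(t_1+\tau),\dots,X(t_n+\tau)) \quad\text{for all } \tau,
\]
\[
\text{reversible: } (X(t_1),\dots,X(t_n)) \;\overset{d}{=}\; (X(\tau-t_1),\dots,X(\tau-t_n)) \quad\text{for all } \tau .
\]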

  27. Reversibility; stationarity • Lemma: a reversible process is stationary. • Theorem: a stationary Markov chain is reversible if and only if there exists a collection of positive numbers π(j), j∈S, summing to unity, that satisfies the detailed balance equations. When there exists such a collection π(j), j∈S, it is the equilibrium distribution. • Proof
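The detailed balance equations referred to here are
\[
\pi(j)\,q(j,k) = \pi(k)\,q(k,j) \qquad \text{for all } j,k\in S .
\]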

  28. Truncation of reversible processes • Diagram: state space S partitioned into A and S\A • Lemma 1.9 / Corollary 1.10: if the transition rates of a reversible Markov process with state space S and equilibrium distribution π(j), j∈S, are altered by changing q(j,k) to cq(j,k) for j∈A, k∈S\A, where c>0, then the resulting Markov process is reversible in equilibrium and has the equilibrium distribution sketched below, where B is the normalizing constant. If c=0, then the reversible Markov process is truncated to A and the resulting Markov process is reversible with the equilibrium distribution sketched below.
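A hedged reconstruction of the two equilibrium distributions (following the standard statement of this lemma):
\[
\pi_c(j) = \begin{cases} B\,\pi(j), & j\in A,\\ B\,c\,\pi(j), & j\in S\setminus A,\end{cases}
\qquad\text{and, for } c=0,\qquad
\pi_A(j) = \frac{\pi(j)}{\sum_{k\in A}\pi(k)}, \quad j\in A .
\]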

  29. Time-reversed process • X(t) reversible Markov process ⇒ X(-t) also, but: Lemma 1.11: time-homogeneity is not inherited for a non-stationary process • Theorem 1.12: if X(t) is a stationary Markov process with transition rates q(j,k) and equilibrium distribution π(j), j∈S, then the reversed process X(-t) is a stationary Markov process with the transition rates given below and the same equilibrium distribution • Theorem 1.13 (Kelly’s lemma): let X(t) be a stationary Markov process with transition rates q(j,k). If we can find a collection of numbers q’(j,k) such that q’(j)=q(j), j∈S, and a collection of positive numbers π(j), j∈S, summing to unity, such that the condition below holds, then q’(j,k) are the transition rates of the time-reversed process, and π(j), j∈S, is the equilibrium distribution of both processes.
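The omitted formulas are, in their standard form,
\[
q'(j,k) = \frac{\pi(k)\,q(k,j)}{\pi(j)} \qquad \text{(transition rates of the reversed process } X(-t)\text{)},
\]
and the condition in Kelly's lemma, in addition to \(q'(j)=q(j)\) for all \(j\in S\), is
\[
\pi(j)\,q'(j,k) = \pi(k)\,q(k,j) \qquad \text{for all } j\ne k .
\]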

  30. Kolmogorov’s criteria • Theorem 1.8: a stationary Markov chain is reversible iff, for each finite sequence of states, the cycle condition below holds • Notice that
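Kolmogorov's criterion in its standard form: for each finite sequence of states \(j_1,j_2,\dots,j_n\in S\),
\[
q(j_1,j_2)\,q(j_2,j_3)\cdots q(j_{n-1},j_n)\,q(j_n,j_1)
= q(j_1,j_n)\,q(j_n,j_{n-1})\cdots q(j_3,j_2)\,q(j_2,j_1).
\]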

  31. Today: • Introduction / motivation course • Discrete-time Markov chain • Continuous time Markov chain • Birth-death process • Example: pure birth process • Example: pure death process • Simple queue • General birth-death process: equilibrium • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Summary / Next • Exercises

  32. Summary / next: basic queueing model; basic tools • Markov chains • Birth-death process • Simple queue • Reversibility, stationarity • Truncation • Kolmogorov’s criteria • Next: input / output of the simple queue; Poisson process; PASTA; output of the simple queue; tandem network; Jackson network; partial balance; Kelly/Whittle network

  33. Exercises [R+SN] 1.1.2, 1.1.4, 1.1.5, 1.2.7, 1.2.8, 1.3.2, 1.3.3 (next time), 1.3.5, 1.3.6, 1.5.1, 1.5.2, 1.5.5, 1.6.2, 1.6.3, 1.6.4, 1.7.1, 1.7.8 (next time) [N] 10.1,6,7,8,9,10,12,13,15
