
Markov Modulated Fluid Flow Analysis



Presentation Transcript


  1. Markov Modulated Fluid Flow Analysis G.U. Hwang Next Generation Communication Networks Lab. Department of Mathematical Sciences KAIST

  2. References
[1] D. Anick, D. Mitra and M.M. Sondhi, Stochastic theory of a data-handling system with multiple sources, The Bell System Technical Journal, Vol. 61, No. 8, 1871-1894, 1982.
[2] D. Mitra, Stochastic theory of a fluid model of producers and consumers coupled by a buffer, Advances in Applied Probability, Vol. 20, 146-176, 1988.
[3] T. Stern and A. Elwalid, Analysis of separable Markov-modulated rate models for information-handling systems, Advances in Applied Probability, Vol. 23, 105-139, 1991.

  3. Markov Modulated Fluid Flow Model
• Consider an irreducible CTMC Yt with state space {1, 2, …, n} having infinitesimal generator Q = (qij).
• When Yt = i, the fluid input rate is the constant Ri. In this case, the source is called a Markov modulated rate process (or a Markov modulated fluid model).
• The service rate of the fluid is a constant c at all times.

  4. Let Xt denote the buffer content at time t. [Diagram: fluid input feeding a buffer, drained by a server at constant rate c.] Then we know
dXt/dt = R(Yt) − c if Xt > 0, and dXt/dt = max(R(Yt) − c, 0) if Xt = 0,
where R(Yt) = Ri when Yt = i.

  5. Analysis
• Let πj = lim(t→∞) P(Yt = j) for 1 ≤ j ≤ n and π = (π1, …, πn).
• Since the underlying CTMC Yt is irreducible and finite, its limiting probabilities exist; π is the unique solution of πQ = 0, πe = 1, where e is the column vector of ones.
• The mean arrival rate is R̄ = Σi πi Ri and the offered load is ρ = R̄/c.
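As a concrete illustration, π, R̄ and ρ can be computed numerically. The sketch below uses a hypothetical 2-state on-off source (mean sojourn times 15 and 11, rates R = (0, 2), service rate c = 1); these parameter values are illustrative assumptions, not part of the slides.

```python
import numpy as np

# Hypothetical 2-state on-off source: mean off time 15, mean on time 11,
# peak rate 2, service rate c = 1 (illustrative values only).
Q = np.array([[-1/15, 1/15],
              [ 1/11, -1/11]])
R = np.array([0.0, 2.0])
c = 1.0

# Stationary distribution: pi Q = 0 together with pi e = 1,
# solved as an overdetermined least-squares system.
n = len(R)
M = np.vstack([Q.T, np.ones(n)])
rhs = np.concatenate([np.zeros(n), [1.0]])
pi = np.linalg.lstsq(M, rhs, rcond=None)[0]

mean_rate = pi @ R        # mean arrival rate R-bar = sum_i pi_i R_i
rho = mean_rate / c       # offered load
```

Here ρ < 1, so the infinite-buffer model analyzed in the following slides is stable for these parameters.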

  6. Infinite buffer model analysis
• Let Fi(t, x) = P(Xt ≤ x, Yt = i), 1 ≤ i ≤ n, and F(t, x) = (F1(t, x), …, Fn(t, x)).
• Since the buffer size is infinite, the system is stable only when ρ < 1, i.e., R̄ < c.
• For simplicity, in the analysis we assume R1 < R2 < … < Rn.

  7. Observe that the net input rate is Ri − c when Yt = i. Then we have
Fi(t + Δt, x) = Σ(j≠i) qji Δt Fj(t, x) + (1 + qii Δt) Fi(t, x − (Ri − c)Δt) + o(Δt),
from which we also get
Fi(t + Δt, x) − Fi(t, x) = Σj qji Δt Fj(t, x) − (Ri − c)Δt ∂Fi(t, x)/∂x + o(Δt).

  8. By letting Δt → 0 we get
∂Fi(t, x)/∂t + (Ri − c) ∂Fi(t, x)/∂x = Σj qji Fj(t, x), 1 ≤ i ≤ n.
• In matrix form, with D = diag(R1 − c, …, Rn − c),
∂F(t, x)/∂t + (∂F(t, x)/∂x) D = F(t, x) Q,
which is, in fact, the Kolmogorov differential equation of our system.

  9. To find the limiting probabilities, we let F(x) = lim(t→∞) F(t, x) and set ∂F(t, x)/∂t → 0 to obtain
(dF(x)/dx) D = F(x) Q, i.e., dF(x)/dx = F(x) Q D⁻¹.
• Assume that the matrix A = Q D⁻¹ is diagonalizable, that is, there exists an invertible matrix B such that B A B⁻¹ = Λ where Λ is a diagonal matrix.

  10. Remark:
• Consider the eigenvalues zi, 1 ≤ i ≤ n, and their corresponding (left) eigenvectors bi of the matrix A, i.e., bi A = zi bi.
• Let B be the matrix whose i-th row is bi and Λ = diag(z1, …, zn).
• Then we have ΛB = BA.
• If B is invertible, i.e., B⁻¹ exists, then Λ = B A B⁻¹.

  11. Since we know the solution of the differential equation dF(x)/dx = F(x) A is given by
F(x) = c e^(Ax) for some constant row vector c,
it immediately follows, using e^(Ax) = B⁻¹ e^(Λx) B, that
F(x) = c B⁻¹ e^(Λx) B.

  12. If we let the i-th row of matrix B be given by bi, then using the fact that e^(Λx) = diag(e^(z1 x), …, e^(zn x)) we get
F(x) = Σ(i=1..n) ai e^(zi x) bi,
where a = (a1, a2, …, an) = c B⁻¹.

  13. Remarks
• The diagonal elements zi of Λ are the eigenvalues of matrix A.
• If we can find n linearly independent left eigenvectors bi, each corresponding to an eigenvalue zi of A, the matrix A is diagonalizable.
• When all the eigenvalues zi are distinct, we have n linearly independent eigenvectors.
• In conclusion, to solve the differential equation we need to find the eigenvalues and eigenvectors of the matrix A = Q D⁻¹.
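The eigenvalues and left eigenvectors of A = Q D⁻¹ can be computed numerically, since the left eigenvectors of A are the right eigenvectors of Aᵀ. A minimal sketch, again using a hypothetical 2-state on-off source (illustrative parameters, not from the slides):

```python
import numpy as np

# Hypothetical 2-state on-off source: rates 1/15 (off->on) and 1/11 (on->off),
# fluid rates R = (0, 2), service rate c = 1 (illustrative values only).
Q = np.array([[-1/15, 1/15],
              [ 1/11, -1/11]])
R = np.array([0.0, 2.0])
c = 1.0

D = np.diag(R - c)           # drift matrix; invertible since R_i != c for all i
A = Q @ np.linalg.inv(D)     # A = Q D^{-1}

# Left eigenpairs of A:  b_i A = z_i b_i  <=>  A^T b_i^T = z_i b_i^T
z, BT = np.linalg.eig(A.T)
B = BT.T                     # row i of B is the left eigenvector for z[i]

for zi, bi in zip(z, B):
    assert np.allclose(bi @ A, zi * bi)   # verify each eigenpair
```

As expected from the theory, one eigenvalue is 0 and, because the system is stable, the other is negative.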

  14. Two important conditions
• Condition 1. Since F(x) is bounded, we must have ai = 0 for every eigenvalue zi > 0. In addition, since 0 is an eigenvalue of A, we let z1 = 0; its left eigenvector satisfies b1 Q D⁻¹ = 0, i.e., b1 Q = 0, so we may take b1 = π, and consequently, a1 = 1 because lim(x→∞) F(x) = π.

  15. Condition 2. For each overload state i, i.e., Ri > c, the buffer cannot stay empty, so Fi(0) = 0. In practice, we see that there are as many negative eigenvalues zi as overload states; if there are n − k overload states, these boundary conditions give n − k (linearly independent) equations determining the n − k unknown coefficients ai associated with the negative eigenvalues.

  16. Overflow probability
• For sufficiently large buffer size x we have
P(X > x) = 1 − F(x)e ≈ γ e^(z* x),
where z* is the largest among the negative eigenvalues zi (the one closest to 0) and γ is the coefficient of the corresponding term (γ = −a* b* e). In this case, we call
• z*: the asymptotic decay rate of the buffer content.
• γ: the asymptotic decay constant of the buffer content.
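Numerically, the asymptotic decay rate is just the negative eigenvalue of A closest to zero. A short sketch for a hypothetical 2-state on-off source (the same illustrative parameters used above are repeated here so the snippet stands alone):

```python
import numpy as np

# Hypothetical 2-state on-off source (illustrative parameters only).
Q = np.array([[-1/15, 1/15],
              [ 1/11, -1/11]])
R = np.array([0.0, 2.0])
c = 1.0

z = np.linalg.eigvals(Q @ np.linalg.inv(np.diag(R - c))).real

# Asymptotic decay rate: the negative eigenvalue closest to 0.
decay_rate = max(zi for zi in z if zi < -1e-12)
# For large x, P(X > x) ~ gamma * exp(decay_rate * x)
```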

  17. Example: 2-state fluid flow
• Consider, for R > c, the on-off source with
Q = [ −λ0  λ0 ; λ1  −λ1 ],  R0 = 0, R1 = R,  so that  D = diag(−c, R − c).
• Since the eigenvalues solve det(zI − Q D⁻¹) = 0, we get
z0 = 0 and z1 = λ0/c − λ1/(R − c).

  18. In order to have z1 < 0 we should have
λ0/c < λ1/(R − c), i.e., R λ0/(λ0 + λ1) < c.
• c.f. the mean input rate: R λ0/(λ0 + λ1) = R π1, where π1 = λ0/(λ0 + λ1).
• Hence the condition says: the mean input rate < the service rate.

  19. Then we have F(x) = π + a1 e^(z1 x) b.
• From the equation z1 b = b Q D⁻¹ where b = (b0, b1), we get b1/b0 = c/(R − c), so we may take b = (R − c, c).
• Therefore, we finally have F(x) = π + a1 e^(z1 x) (R − c, c).

  20. From F1(0) = 0, we get π1 + a1 c = 0, i.e., a1 = −π1/c.
• Hence
P(X > x) = 1 − F(x)e = −a1 e^(z1 x) (b0 + b1) = (R π1/c) e^(z1 x) = ρ e^(z1 x).

  21. [Plot: tail probability P(X > x) for 1/λ0 = 15, 1/λ1 = 11, R = 2, c = 1.]
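The curve on this slide can be reproduced directly from the closed form P(X > x) = ρ e^(z1 x). A minimal sketch with the slide's parameters, reading 1/λ0 and 1/λ1 as the mean off and on times respectively (an assumption about the notation):

```python
import math

# Slide parameters: 1/lambda0 = 15, 1/lambda1 = 11, R = 2, c = 1.
# lambda0 is read as the off->on rate and lambda1 as the on->off rate
# (an assumption about which symbol is which).
lam0, lam1, R, c = 1/15, 1/11, 2.0, 1.0

pi1 = lam0 / (lam0 + lam1)       # stationary probability of the on state
rho = R * pi1 / c                # offered load = P(X > 0)
z1 = lam0 / c - lam1 / (R - c)   # the negative eigenvalue

def tail(x):
    """P(X > x) = rho * exp(z1 * x) for the 2-state on-off fluid queue."""
    return rho * math.exp(z1 * x)
```

With these values ρ ≈ 0.846 and z1 = 1/15 − 1/11 < 0, so the tail decays exponentially from ρ at x = 0.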

  22. Homework – Fluid flow (1/2)
• Consider a superposition of N independent on-off sources.
• J(t) = the number of active sources at time t.
• When the state is k, the arrival rate is kλ, where λ is the peak rate of a single source.
• Assume that the service rate is c.
[State-transition diagram: 0 ↔ 1 ↔ 2 ↔ … ↔ N−1 ↔ N]

  23. Let X be the buffer content in the steady state.
• Make a program to compute the tail probability P(X > x).
• Plot the tail probabilities for 0 ≤ x ≤ 100.
• Compare the results with those in the QBD analysis.
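One possible skeleton for such a program, following the spectral method of the preceding slides. This is a sketch, not a definitive solution: the function name onoff_tail and the per-source rate names alpha (off→on), beta (on→off) and peak are my own choices, and the code assumes k·peak ≠ c for every k, a stable load, and as many negative eigenvalues as overload states.

```python
import numpy as np

def onoff_tail(N, alpha, beta, peak, c, xs):
    """P(X > x) at each x in xs, for a fluid queue fed by N iid on-off
    sources (off->on rate alpha, on->off rate beta, peak rate `peak` per
    source), served at constant rate c.  Spectral sketch; assumes
    k*peak != c for all k, stability, and real distinct eigenvalues."""
    n = N + 1
    # Birth-death generator of J(t), the number of active sources.
    Q = np.zeros((n, n))
    for k in range(n):
        if k < N:
            Q[k, k + 1] = (N - k) * alpha
        if k > 0:
            Q[k, k - 1] = k * beta
        Q[k, k] = -Q[k].sum()
    R = peak * np.arange(n)              # arrival rate in state k is k*peak
    D = np.diag(R - c)                   # drift matrix

    # Stationary distribution of J(t): pi Q = 0, pi e = 1.
    M = np.vstack([Q.T, np.ones(n)])
    pi = np.linalg.lstsq(M, np.concatenate([np.zeros(n), [1.0]]), rcond=None)[0]

    # Left eigenpairs of A = Q D^{-1}.
    A = Q @ np.linalg.inv(D)
    z, BT = np.linalg.eig(A.T)
    z, B = z.real, BT.T.real

    neg = [i for i in range(n) if z[i] < -1e-9]   # negative eigenvalues
    over = np.where(R > c)[0]                     # overload states: F_i(0) = 0
    # F(x) = pi + sum_{i in neg} a_i exp(z_i x) b_i; impose F_j(0) = 0
    # for every overload state j to find the coefficients a_i.
    M2 = np.array([[B[i, j] for i in neg] for j in over])
    a = np.linalg.solve(M2, -pi[over])

    def F(x):
        return pi + sum(ai * np.exp(z[i] * x) * B[i] for ai, i in zip(a, neg))
    return np.array([1.0 - F(x).sum() for x in xs])
```

For N = 1 this reduces to the 2-state example of the earlier slides, so it can be sanity-checked against the closed form P(X > x) = ρ e^(z1 x) before being compared with the QBD results.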
