
Introduction to Stochastic Models GSLM 54100








  1. Introduction to Stochastic Models GSLM 54100

  2. Outline • independence of random variables • variance and covariance • two useful ideas • examples • conditional distribution

  3. Independent Random Variables • two random variables X and Y are independent ⇔ all events generated by X and Y are independent • discrete X and Y: P(X = x, Y = y) = P(X = x)P(Y = y) for all x, y • continuous X and Y: fX,Y(x, y) = fX(x)fY(y) for all x, y • any X and Y: FX,Y(x, y) = FX(x)FY(y) for all x, y
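The discrete criterion is mechanical to check on any tabulated joint pmf. A minimal sketch (the uniform joint pmf below is a made-up example, not from the worksheets):

```python
from itertools import product

# A hypothetical joint pmf for illustration: X and Y each uniform on {1, 2},
# constructed so that every joint probability is a product of marginals.
joint = {(x, y): 0.25 for x, y in product([1, 2], repeat=2)}

# Marginal pmfs obtained by summing the joint pmf over the other variable.
px = {x: sum(q for (xx, _), q in joint.items() if xx == x) for x in [1, 2]}
py = {y: sum(q for (_, yy), q in joint.items() if yy == y) for y in [1, 2]}

# X and Y are independent iff P(X=x, Y=y) = P(X=x)P(Y=y) for all x, y.
independent = all(abs(joint[x, y] - px[x] * py[y]) < 1e-12 for x, y in joint)
print(independent)  # True for this pmf
```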

  4. Proposition 2.3 • E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independent X, Y • different meanings of E() • Ex #7 of WS #5 (Functions of independent random variables) • let X and Y be independent and identically distributed (i.i.d.) random variables, each equally likely to be 1, 2, or 3 • Z = XY • E(X) = ? E(Y) = ? distribution of Z? E(Z) = E(X)E(Y)? • E(Z) can be read either as the mean of a function of X and Y, or as the mean of the random variable Z; a sketch checking both readings follows this slide
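Here is that sketch for Ex #7 of WS #5: it enumerates all nine equally likely (x, y) pairs to build the distribution of Z = XY exactly, then checks E(Z) = E(X)E(Y) = 2 · 2 = 4.

```python
from fractions import Fraction
from itertools import product

# X and Y i.i.d., each equally likely to be 1, 2, or 3.
p = Fraction(1, 9)  # P(X = x, Y = y) = (1/3)(1/3) by independence

# Distribution of Z = XY, built by enumerating all (x, y) pairs.
pz = {}
for x, y in product([1, 2, 3], repeat=2):
    pz[x * y] = pz.get(x * y, 0) + p

print(pz)  # P(Z=1)=1/9, P(Z=2)=2/9, P(Z=3)=2/9, P(Z=4)=1/9, P(Z=6)=2/9, P(Z=9)=1/9

# E(Z) read as the mean of the random variable Z ...
ez = sum(z * prob for z, prob in pz.items())
# ... agrees with E(X)E(Y) = 4, as Proposition 2.3 asserts.
print(ez)  # 4
```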

  5. Proposition 2.3 • E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independent X, Y • different meanings of E() • E[g(X)] = Σx g(x)P(X = x) • E[h(Y)] = Σy h(y)P(Y = y) • E[g(X)h(Y)] = Σx Σy g(x)h(y)P(X = x)P(Y = y) • x and y are dummy variables

  6. Variance and Covariance (Ross, pp 52-53) • Cov(X, Y) = E(XY) − E(X)E(Y) • Cov(X, X) = Var(X) • Cov(X, Y) = Cov(Y, X) • Cov(cX, Y) = cCov(X, Y) • Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z) • Cov(Σi Xi, Σj Yj) = Σi Σj Cov(Xi, Yj) • Var(Σi Xi) = Σi Var(Xi) + 2Σi<j Cov(Xi, Xj)
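These identities are easy to sanity-check numerically; a minimal Monte Carlo sketch, with arbitrary made-up distributions for X, Y, and Z (sample covariance is bilinear too, so the last two printed values agree up to floating-point rounding):

```python
import random

random.seed(0)
N = 200_000

# Arbitrary variables for illustration; Y is deliberately dependent on X.
xs = [random.gauss(0, 1) for _ in range(N)]
ys = [x + random.gauss(0, 1) for x in xs]
zs = [random.uniform(-1, 1) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

# Cov(X, X) = Var(X): near 1 here, since X is standard normal.
print(cov(xs, xs))

# Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z).
yz = [y + z for y, z in zip(ys, zs)]
print(cov(xs, yz), cov(xs, ys) + cov(xs, zs))
```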

  7. Two Useful Ideas

  8. Two Useful Ideas • for X = X1 + … + Xn, E(X) = E(X1) + … + E(Xn), whether or not the Xi are independent • for a prize randomly assigned to one of n lottery tickets, the probability of winning the prize = 1/n for every ticket • the order of buying a ticket does not change the probability of winning
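The first idea is easy to see numerically even under complete dependence; a small sketch with the made-up choice X2 = 7 − X1, a deterministic function of X1:

```python
import random

random.seed(1)
N = 100_000

# X1 uniform on {1, ..., 6}; X2 = 7 - X1 is completely determined by X1.
x1 = [random.randint(1, 6) for _ in range(N)]
x2 = [7 - v for v in x1]

def mean(v):
    return sum(v) / len(v)

# Linearity of expectation needs no independence assumption:
# E(X1 + X2) = E(X1) + E(X2) = 3.5 + 3.5 = 7, despite total dependence.
print(mean([a + b for a, b in zip(x1, x2)]))  # exactly 7.0: every pair sums to 7
print(mean(x1) + mean(x2))                    # also exactly 7.0
```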

  9. Applications of the Two Ideas • the following are interesting applications • mean of Bin(n, p) (Ex #7(b) of WS #8) • variance of Bin(n, p) (Ex #8(b) of WS #8) • the probability of winning a lottery (Ex #3(b) of WS #9) • mean of a hypergeometric random variable (Ex #4 of WS #9) • mean and variance of the random number of matches (Ex #5 of WS #9)

  10. Mean of Bin(n, p) Ex #7(b) of WS #8 • X ~ Bin(n, p) • write X = I1 + … + In, where Ii = 1 if trial i is a success and Ii = 0 otherwise • E(X) = E(I1 + … + In) = E(I1) + … + E(In) = np

  11. Variance of Bin(n, p) Ex #8(b) of WS #8 • X ~ Bin(n, p) • find V(X) from V(I1 + … + In) • since the Ii are independent, V(X) = V(I1 + … + In) = nV(I1) = np(1 − p)
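Both indicator computations can be checked by simulation; a sketch with hypothetical values n = 10, p = 0.3:

```python
import random

random.seed(2)
n, p, N = 10, 0.3, 200_000

# Build X ~ Bin(n, p) as I1 + ... + In: Ii = 1 iff trial i succeeds.
xs = [sum(1 for _ in range(n) if random.random() < p) for _ in range(N)]

mean = sum(xs) / N
var = sum((x - mean) ** 2 for x in xs) / N

print(mean, n * p)           # both near np = 3
print(var, n * p * (1 - p))  # both near np(1 - p) = 2.1
```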

  12. Probability of Winning a Lottery Ex #3(b) & (c) of WS #9 • a grand prize is randomly assigned to one of n lottery tickets • (b) Let n ≥ 3. Find the probability that the third person who buys a ticket wins the grand prize • (c) Let Ii = 1 if the ith person who buys a ticket wins the grand prize, and Ii = 0 otherwise, 1 ≤ i ≤ n • (i) Show that all Ii have the same (marginal) distribution • Find cov(Ii, Ij) for i ≠ j • Verify that the answers are consistent with V(I1 + … + In) = 0

  13. Probability of Winning a Lottery Ex #3(b) & (c) of WS #9 • (b) A = the third person buying a ticket wins the grand prize • find P(A) when there are 3 persons • Sol. P(A) = (2/3)(1/2)(1) = 1/3 • actually the order does not matter • think of randomly throwing a ball into one of three boxes

  14. Probability of Winning a Lottery Ex #3(b) & (c) of WS #9 • (c)(i). P(Ij = 1) = 1/n for any j • P(Ij = 1) = [(n−1)/n][(n−2)/(n−1)]…[1/(n−j+1)] = 1/n, a telescoping product • for i ≠ j, cov(Ii, Ij) = E(IiIj) − E(Ii)E(Ij) • E(IiIj) = 0 (only one ticket can win) ⇒ cov(Ii, Ij) = −1/n² • checking: V(I1 + … + In) = V(1) = 0, and Σi V(Ii) + 2Σi<j cov(Ii, Ij) = n(1/n)(1 − 1/n) + n(n−1)(−1/n²) = 0
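A simulation of the lottery (with a hypothetical n = 5) confirms that every buying position wins with probability 1/n and that cov(Ii, Ij) = −1/n²; a sketch:

```python
import random

random.seed(3)
n, N = 5, 200_000

# The prize is equally likely to sit in any of the n tickets, so the
# k-th buyer wins iff the prize position equals k.
wins = [0] * n
both = 0  # count of trials where buyers 1 AND 2 both win (impossible)
for _ in range(N):
    pos = random.randrange(n)
    wins[pos] += 1
    both += (pos == 0) and (pos == 1)  # always False: only one winner

print([w / N for w in wins])  # each entry near 1/n = 0.2

# cov(I1, I2) = E(I1 I2) - E(I1)E(I2) = 0 - 1/n^2
print(both / N - (wins[0] / N) * (wins[1] / N))  # near -1/25 = -0.04
```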

  15. Hypergeometric in the Context of Ex #4 of WS #9 • 3 balls are randomly picked from 2 white & 3 black balls • X = the total number of white balls picked • P(X = k) = C(2, k)C(3, 3−k)/C(5, 3), so P(X = 0) = 1/10, P(X = 1) = 6/10, P(X = 2) = 3/10 • E(X) = (0)(1/10) + (1)(6/10) + (2)(3/10) = 6/5

  16. Hypergeometric in the Context of Ex #4 of WS #9 • Ex #4(c). Assume that the three picked balls are put in bins 1, 2, and 3 in the order of being picked • (i). Find P(bin i contains a white ball), i = 1, 2, & 3 • (ii). Define Bi = 1 if the ball in bin i is white, i = 1, 2, and 3. Find E(X) by relating X to B1, B2, and B3

  17. Hypergeometric in the Context of Ex #4 of WS #9 • (i). P(bin i contains a white ball) = 2/5 • each ball is equally likely to be in bin i • (ii). Bi = 1 if the ball in bin i is white, and Bi = 0 otherwise • X = B1 + B2 + B3 • E(Bi) = P(bin i contains a white ball) = 2/5 • E(X) = E(B1) + E(B2) + E(B3) = 6/5

  18. Hypergeometric in the Context of Ex #4 of WS #9 • Ex #4(d). Arbitrarily label the white balls as 1 and 2 • (i). Find P(white ball 1 is put in a bin); find P(white ball 2 is put in a bin) • (ii). Let Wi = 1 if white ball i is put in a bin, and Wi = 0 otherwise, i = 1, 2; find E(X) from Wi

  19. Hypergeometric in the Context of Ex #4 of WS #9 • (i) P(white ball 1 is put in a bin) = 3/5 • each ball is equally likely to be among the three picked • (ii) Wi = 1 if white ball i is put in a bin, and Wi = 0 otherwise, i = 1, 2. Find E(X) by relating X to W1 and W2 • X = W1 + W2 • E(Wi) = P(white ball i is put in a bin) = 3/5 • E(X) = E(W1) + E(W2) = 6/5
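Both indicator decompositions of Ex #4 can be confirmed by exhaustive enumeration of all equally likely ordered picks; a sketch in exact arithmetic (the ball labels W1, W2, B1, B2, B3 are just for bookkeeping):

```python
from fractions import Fraction
from itertools import permutations

balls = ["W1", "W2", "B1", "B2", "B3"]  # 2 white, 3 black

# Every ordered pick of 3 balls from the 5 is equally likely: 60 outcomes.
picks = list(permutations(balls, 3))
p = Fraction(1, len(picks))

# E(X) directly: X = number of white balls among the three picked.
ex = sum(p * sum(1 for b in pick if b.startswith("W")) for pick in picks)
print(ex)  # 6/5

# (c): P(bin i contains a white ball) = 2/5 for each position i.
for i in range(3):
    print(sum(p for pick in picks if pick[i].startswith("W")))  # 2/5

# (d): P(white ball 1 is put in a bin) = 3/5.
print(sum(p for pick in picks if "W1" in pick))  # 3/5
```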

  20. Mean and Variance of Random Number of Matches Ex #5 of WS #9 • gift exchange among n participants • X = total # of participants who get back their own gifts • (a). Find P(the ith participant gets back his own gift) • (b). Let Ii = 1 if the ith participant gets back his own gift, and Ii = 0 otherwise, 1 ≤ i ≤ n. Relate X to I1, …, In • (c). Find E(X) from (b) • (d). Find cov(Ii, Ij) for i ≠ j • (e). Find V(X)

  21. Mean and Variance of Random Number of Matches Ex #5 of WS #9 • (a). P(the ith participant gets back his own gift) = 1/n • each gift being equally likely to be picked by each participant • (b). Ii = 1 if the ith participant gets back his own gift, and Ii = 0 otherwise, 1 ≤ i ≤ n; X = I1 + … + In • (c). E(X) = E(I1 + … + In) = n(1/n) = 1 • (d). for i ≠ j, cov(Ii, Ij) = E(IiIj) − E(Ii)E(Ij) • E(IiIj) = P(Ii = 1, Ij = 1) = P(Ii = 1|Ij = 1)P(Ij = 1) = 1/[n(n−1)] • cov(Ii, Ij) = 1/[n²(n−1)] • (e). V(X) = Σi V(Ii) + 2Σi<j cov(Ii, Ij) = n(1/n)(1 − 1/n) + n(n−1)·1/[n²(n−1)] = 1
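A quick simulation of the matching problem (with a hypothetical n = 8) confirms E(X) = 1 and V(X) = 1; a sketch:

```python
import random

random.seed(4)
n, N = 8, 200_000

# Each trial: gifts are handed back in a uniformly random order; X counts
# participants who get their own gift (fixed points of the permutation).
xs = []
for _ in range(N):
    perm = list(range(n))
    random.shuffle(perm)
    xs.append(sum(1 for i, g in enumerate(perm) if i == g))

mean = sum(xs) / N
var = sum((x - mean) ** 2 for x in xs) / N
print(mean, var)  # both near 1, matching E(X) = 1 and V(X) = 1
```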

  22. Example 1.11 of Ross • it is still too complicated to discuss at this point; let us postpone its discussion until after covering conditional probability and conditional distributions

  23. Chapter 2 • material to read: from page 21 to page 59 (Section 2.5.3) • Examples highlighted: Examples 2.3, 2.5, 2.17, 2.18, 2.19, 2.20, 2.21, 2.30, 2.31, 2.32, 2.34, 2.35, 2.36, 2.37 • Sections and material highlighted: 2.2.1, 2.2.2, 2.2.3, 2.2.4, 2.3.1, 2.3.2, 2.3.3, 2.4.3, Proposition 2.1, Corollary 2.2, 2.5.1, 2.5.2, Proposition 2.3, 2.5.3, Properties of Covariance

  24. Chapter 2 • Exercises #5, #11, #20, #23, #29, #37, #42, #43, #44, #45, #46, #51, #57, #71, #72

  25. Conditional Distributions

  26. Conditional Distribution • X ~ {pn} and A is an event • 0 ≤ P(X = n|A) ≤ 1 • Σn P(X = n|A) = 1 • ⇒ {P(X = n|A)} is a probability mass function, called the conditional distribution of X given A

  27. Conditional Distribution • define Z = (X|A) • Z is a random variable • E(Z) and Var(Z) are well-defined • E(X|A) is the conditional mean of X given A • Var(X|A) is the conditional variance of X given A • the event A can be defined by a random variable, e.g., A = {Y = 3}

  28. Ex #1 of WS #5 • Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n). • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • Find the (marginal) distribution of X. • Find the (marginal) distribution of Y. • Find the conditional distributions of (X|Y = 1), (X|Y = 2), and (X|Y = 3). • Find the conditional means E(X|Y = 1), E(X|Y = 2), and E(X|Y = 3). • Find the conditional variances V(X|Y = 1), V(X|Y = 2), and V(X|Y = 3).

  29. Ex #1 of WS #5 • joint distribution as above, pm,n = P(X = m, Y = n) • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • distribution of X: p1 = 1/4, p2 = 1/2, p3 = 1/4 • distribution of Y: p1 = 3/8, p2 = 3/8, p3 = 1/4

  30. Ex #1 of WS #5 • joint distribution as above • conditional distribution of • (X|Y = 1): P(X = 1|Y = 1) = 0; P(X = 2|Y = 1) = 2/3; P(X = 3|Y = 1) = 1/3 • (X|Y = 2): P(X = 1|Y = 2) = 1/3; P(X = 2|Y = 2) = 2/3; P(X = 3|Y = 2) = 0 • (X|Y = 3): P(X = 1|Y = 3) = 1/2; P(X = 2|Y = 3) = 0; P(X = 3|Y = 3) = 1/2

  31. Ex #1 of WS #5 • joint distribution as above • (X|Y = 1) being a random variable with a well-defined distribution • the conditional means are well-defined • E[(X|Y = 1)] = (2)(2/3) + (3)(1/3) = 7/3 • E[(X|Y = 2)] = 5/3 • E[(X|Y = 3)] = 2

  32. Ex #1 of WS #5 • joint distribution as above • (X|Y = 1) being a random variable with a well-defined distribution • the conditional variances are well-defined • V(X|Y = 1) = E(X²|Y = 1) − E²(X|Y = 1) = 2/9 • V(X|Y = 2) = 2/9 • V(X|Y = 3) = 1

  33. Ex #1 of WS #5 • note the mapping defined by the conditional means E[(X|Y = 1)] = 7/3, E[(X|Y = 2)] = 5/3, E[(X|Y = 3)] = 2 • on the event {Y = 1}, the mapping gives 7/3 • on {Y = 2}, the mapping gives 5/3 • on {Y = 3}, the mapping gives 2 • the mapping E(X|Y), i.e., the conditional mean, defines a random variable • E[E(X|Y)] = (3/8)(7/3) + (3/8)(5/3) + (1/4)(2) = 2 • incidentally, E(X) = 2 as well

  34. Ex #1 of WS #5 • note the mapping defined by the conditional variances V(X|Y = 1) = 2/9, V(X|Y = 2) = 2/9, V(X|Y = 3) = 1 • on the event {Y = 1}, the mapping gives 2/9 • on {Y = 2}, the mapping gives 2/9 • on {Y = 3}, the mapping gives 1 • the mapping V(X|Y), i.e., the conditional variance, defines a random variable • E[V(X|Y)] = (3/8)(2/9) + (3/8)(2/9) + (1/4)(1) = 5/12
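All of Ex #1, including the identities E[E(X|Y)] = E(X) = 2 and E[V(X|Y)] = 5/12, can be reproduced mechanically from the joint pmf; a sketch in exact arithmetic:

```python
from fractions import Fraction as F

# Joint pmf p[m, n] = P(X = m, Y = n) from Ex #1 of WS #5.
p = {(1, 1): F(0), (1, 2): F(1, 8), (1, 3): F(1, 8),
     (2, 1): F(1, 4), (2, 2): F(1, 4), (2, 3): F(0),
     (3, 1): F(1, 8), (3, 2): F(0),   (3, 3): F(1, 8)}

vals = (1, 2, 3)
py = {y: sum(p[x, y] for x in vals) for y in vals}  # marginal of Y

cond_mean, cond_var = {}, {}
for y in vals:
    # Conditional pmf of X given Y = y: P(X = x|Y = y) = p[x, y] / P(Y = y).
    cond = {x: p[x, y] / py[y] for x in vals}
    cond_mean[y] = sum(x * q for x, q in cond.items())               # E(X|Y = y)
    cond_var[y] = sum(x * x * q for x, q in cond.items()) - cond_mean[y] ** 2
    print(y, cond, cond_mean[y], cond_var[y])  # means 7/3, 5/3, 2; variances 2/9, 2/9, 1

# E(X|Y) and V(X|Y) are random variables (functions of Y); average over Y.
print(sum(py[y] * cond_mean[y] for y in vals))  # E[E(X|Y)] = 2 = E(X)
print(sum(py[y] * cond_var[y] for y in vals))   # E[V(X|Y)] = 5/12
```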
