Introduction to Stochastic Models GSLM 54100



  1. Introduction to Stochastic Models GSLM 54100

  2. Outline • conditional probability • conditional distribution • recursive relationships

  3. Conditional Probability

  4. Example 3.1.3 • a car behind one of the 3 doors, I, II, and III • door I chosen by a guest, who gets whatever is behind the chosen door • door III opened by the host, with nothing behind it • the guest given an option to switch his choice • good for the guest to switch?

  5. Example 3.1.3 • the correct answer depends on the assumptions of the problem • assumption 1: an empty door deliberately opened by the host • defining events • G = door III opened (given that door I is chosen) • Gc = door II opened (given that door I is chosen) • A (B and C) = the car behind door I (II and III)

  6. Example 3.1.3 • under assumption 1, P(G|A) = 1/2, P(G|B) = 1, P(G|C) = 0 • P(G) = P(G|A)P(A) + P(G|B)P(B) + P(G|C)P(C) = (1/2)(1/3) + (1)(1/3) + (0)(1/3) = 1/2 • P(A|G) = P(G|A)P(A)/P(G) = (1/2)(1/3)/(1/2) = 1/3 • P(B|G) = P(G|B)P(B)/P(G) = (1)(1/3)/(1/2) = 2/3 • good for the guest to switch

  7. Example 3.1.3 • assumption 2: one of the three doors randomly opened by the host, which incidentally was the empty door III • P(G|A) = P(G|B) = P(G|C) = 1/3 • P(G) = P(G|A)P(A) + P(G|B)P(B) + P(G|C)P(C) = 1/3 • G being independent of A, B, and C • P(A|G) = P(A) = 1/3 and P(B|G) = P(B) = 1/3 • switching neither helps nor hurts the guest
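The two assumptions can be checked numerically. The sketch below (a hypothetical simulation, not part of the original notes) estimates the guest's winning probability under each assumption; door numbers 0, 1, 2 stand for doors I, II, III.

```python
import random

def monty_hall(switch, host_knows, trials=100_000):
    """Estimate the guest's win probability when the guest picks door 0.

    host_knows=True  -> assumption 1: the host deliberately opens an empty door.
    host_knows=False -> assumption 2: the host opens one of the other two doors
                        at random; trials where the car is revealed are discarded,
                        i.e., we condition on the opened door being empty.
    """
    wins = played = 0
    for _ in range(trials):
        car = random.randrange(3)          # car placed behind a random door
        guest = 0                          # guest always picks door 0 (door I)
        others = [d for d in range(3) if d != guest]
        if host_knows:
            opened = random.choice([d for d in others if d != car])
        else:
            opened = random.choice(others)
            if opened == car:              # car revealed: discard this trial
                continue
        played += 1
        final = guest if not switch else next(
            d for d in range(3) if d not in (guest, opened))
        wins += final == car
    return wins / played
```

Under assumption 1, switching wins about 2/3 of the time; under assumption 2, switching and staying both win about 1/2 of the time, matching the two slides above.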

  8. Conditional Distributions

  9. Conditional Distribution • X ~ {pn} and A is an event • 0 ≤ P(X = n|A) ≤ 1 • Σn P(X = n|A) = 1 • ⇒ {P(X = n|A)} is a probability mass function, called the conditional distribution of X given A

  10. Conditional Distribution • event A, discrete random variables X & Y • P(X = x|A) = the conditional p.m.f. of X given A • P(X = x|Y = y) = the conditional p.m.f. of X given Y = y

  11. Conditional Distribution • event A, continuous random variables X & Y • fX|A(x|A) = the conditional density function of X given A • fX|Y(x|y) = the conditional density function of X given Y = y

  12. Modified from Example 3.2 of Ross • X1 ~ Bin(5, 0.3), X2 ~ Bin(10, 0.3), independent • P(X1 = 2|X1+X2 = 8) = C(5,2)C(10,6)/C(15,8) • (X1|X1+X2 = 8) ~ hypergeometric
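A quick check of the hypergeometric form (a sketch, using the parameters of this slide): computing the conditional p.m.f. from the two binomial p.m.f.s shows the common success probability p cancel out.

```python
from math import comb

def cond_pmf(k, n1=5, n2=10, s=8, p=0.3):
    """P(X1 = k | X1 + X2 = s) for independent X1 ~ Bin(n1, p), X2 ~ Bin(n2, p);
    p cancels, leaving the hypergeometric pmf C(n1,k)C(n2,s-k)/C(n1+n2,s)."""
    num = comb(n1, k) * p**k * (1 - p)**(n1 - k) \
        * comb(n2, s - k) * p**(s - k) * (1 - p)**(n2 - (s - k))
    den = comb(n1 + n2, s) * p**s * (1 - p)**(n1 + n2 - s)
    return num / den
```

For example, cond_pmf(2) agrees with C(5,2)C(10,6)/C(15,8) to machine precision, and the values sum to 1 over k = 0, …, 5.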

  13. Preparation for Example 3.3 of Ross • X ~ Poisson(λ); Y ~ Poisson(μ); independent • distribution of X+Y: P(X+Y = n) = Σk=0..n P(X = k)P(Y = n−k) = e^−(λ+μ)(λ+μ)^n/n!, i.e., X+Y ~ Poisson(λ+μ)

  14. Example 3.3 of Ross • X ~ Poisson(λ1); Y ~ Poisson(λ2); independent • find the distribution of (X|X + Y = n) • (X|X + Y = n) ~ Bin(n, λ1/(λ1+λ2))
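The binomial form follows by dividing the product of the two Poisson p.m.f.s by the Poisson(λ1+λ2) p.m.f. of the sum; a minimal numerical check of that identity (sample values λ1 = 2, λ2 = 3 are my own, for illustration):

```python
from math import exp, factorial

def pois_pmf(k, lam):
    """Poisson pmf: e^(-lam) lam^k / k!"""
    return exp(-lam) * lam**k / factorial(k)

def cond_pois(k, n, lam1, lam2):
    """P(X = k | X + Y = n) for independent X ~ Poisson(lam1), Y ~ Poisson(lam2),
    using X + Y ~ Poisson(lam1 + lam2) from the previous slide."""
    return pois_pmf(k, lam1) * pois_pmf(n - k, lam2) / pois_pmf(n, lam1 + lam2)
```

For every k, cond_pois(k, n, lam1, lam2) equals the Bin(n, lam1/(lam1+lam2)) p.m.f. at k.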

  15. Conditional Distribution • X ~ {pn} and A is an event • 0 ≤ P(X = n|A) ≤ 1 • Σn P(X = n|A) = 1 • ⇒ {P(X = n|A)} is a probability mass function, called the conditional distribution of X given A

  16. Conditional Distribution • define Z = (X|A) • Z is a random variable • E(Z) and Var(Z) being well-defined • E(X|A), the conditional mean of X given A • Var(X|A), the conditional variance of X given A • event A can be defined by a random variable, e.g., A = {Y = 3}

  17. Ex #1 of WS #5 • Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n). • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • Find the (marginal) distribution of X. • Find the (marginal) distribution of Y. • Find the conditional distributions of (X|Y = 1), (X|Y = 2), and (X|Y = 3). • Find the conditional means E(X|Y = 1), E(X|Y = 2), and E(X|Y = 3). • Find the conditional variances V(X|Y = 1), V(X|Y = 2), and V(X|Y = 3).

  18. Ex #1 of WS #5 • Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n). • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • distribution of X: p1 = 1/4, p2 = 1/2, p3 = 1/4 • distribution of Y: p1 = 3/8, p2 = 3/8, p3 = 1/4

  19. Ex #1 of WS #5 • Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n). • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • conditional distribution of • (X|Y = 1): p(X=1|Y=1) = 0; p(X=2|Y=1) = 2/3; p(X=3|Y=1) = 1/3 • (X|Y = 2): p(X=1|Y=2) = 1/3; p(X=2|Y=2) = 2/3; p(X=3|Y=2) = 0 • (X|Y = 3): p(X=1|Y=3) = 1/2; p(X=2|Y=3) = 0; p(X=3|Y=3) = 1/2

  20. Ex #1 of WS #5 • Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n). • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • (X|Y = 1) being a random variable with well-defined distribution • the conditional means being well-defined • E[(X|Y = 1)] = (2)(2/3)+(3)(1/3) = 7/3 • E[(X|Y = 2)] = 5/3 • E[(X|Y = 3)] = 2

  21. Ex #1 of WS #5 • Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n). • p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8; • p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0; • p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8. • (X|Y = 1) being a random variable with well-defined distribution • the conditional variances being well-defined • V(X|Y = 1) = E(X^2|Y = 1) − [E(X|Y = 1)]^2 = 2/9 • V(X|Y = 2) = 2/9 • V(X|Y = 3) = 1

  22. Ex #1 of WS #5 • note the mapping defined by the conditional means E[(X|Y = 1)] = 7/3, E[(X|Y = 2)] = 5/3, E[(X|Y = 3)] = 2 • at any outcome ω with Y(ω) = 1, the mapping gives 7/3 • at any outcome ω with Y(ω) = 2, the mapping gives 5/3 • at any outcome ω with Y(ω) = 3, the mapping gives 2 • the mapping E(X|Y), i.e., the conditional mean, defines a random variable • E[E(X|Y)] = (3/8)(7/3)+(3/8)(5/3)+(1/4)(2) = 2 • incidentally E(X) = 2

  23. Ex #1 of WS #5 • note the mapping defined by the conditional variances V(X|Y = 1) = 2/9, V(X|Y = 2) = 2/9, V(X|Y = 3) = 1 • at any outcome ω with Y(ω) = 1, the mapping gives 2/9 • at any outcome ω with Y(ω) = 2, the mapping gives 2/9 • at any outcome ω with Y(ω) = 3, the mapping gives 1 • the mapping V(X|Y), i.e., the conditional variance, defines a random variable • E[V(X|Y)] = (3/8)(2/9)+(3/8)(2/9)+(1/4)(1) = 5/12
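The worksheet numbers above can be reproduced exactly with rational arithmetic; a minimal sketch of the computation from the joint p.m.f.:

```python
from fractions import Fraction as F

# joint pmf from Exercise 1: p[(m, n)] = P(X = m, Y = n)
p = {(1, 1): F(0), (1, 2): F(1, 8), (1, 3): F(1, 8),
     (2, 1): F(1, 4), (2, 2): F(1, 4), (2, 3): F(0),
     (3, 1): F(1, 8), (3, 2): F(0), (3, 3): F(1, 8)}

pY = {n: sum(p[(m, n)] for m in (1, 2, 3)) for n in (1, 2, 3)}  # marginal of Y

def cond_mean(n):
    """E(X | Y = n)."""
    return sum(m * p[(m, n)] for m in (1, 2, 3)) / pY[n]

def cond_var(n):
    """V(X | Y = n) = E(X^2 | Y = n) - [E(X | Y = n)]^2."""
    m2 = sum(m * m * p[(m, n)] for m in (1, 2, 3)) / pY[n]
    return m2 - cond_mean(n) ** 2

# tower property: E[E(X|Y)] = E(X), as the slide notes "incidentally"
EX = sum(m * q for (m, n), q in p.items())
assert sum(pY[n] * cond_mean(n) for n in (1, 2, 3)) == EX == 2
```

Using `Fraction` keeps every value exact, so the results match the slide's fractions (7/3, 5/3, 2; 2/9, 2/9, 1; 5/12) with no rounding.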

  24. A former Mid-Term Question (Compared to Example 3.4 in Ross) • three types of cakes, chocolate, mango, and strawberry, in a bakery • each customer choosing chocolate, mango, and strawberry w.p. 1/2, 1/3, and 1/6, respectively, independent of everything else • profit from each piece of chocolate, mango, and strawberry cake ~ $3, $2, and $1, respectively • 4 cream cakes sold on a particular day • (a). Let Xc be the number of chocolate cream cakes sold on that day. Find the distribution of Xc. • (b). Find the expected total profit of the day from the 4 cream cakes. • (c). Given that no chocolate cream cake is sold on that day, find the variance of the total profit of the day.

  25. A former Mid-Term Question (Compared to Example 3.4 in Ross) • (a). Xc = the number of chocolate cream cakes sold on that day • Xc ~ Bin(4, 1/2) • (b). E(total profit of the day) = E(3Xc + 2Xm + Xs) = 3E(Xc) + 2E(Xm) + E(Xs) = 6+(8/3)+(2/3) = 28/3.

  26. A former Mid-Term Question (Compared to Example 3.4 in Ross) • (c). Given that no chocolate cream cake is sold on that day, find the variance of the total profit of the day. • given Xc = 0, each cake is mango with probability 2/3 and strawberry with probability 1/3 • (Xm|Xc = 0) ~ Bin(4, 2/3) and (Xs|Xc = 0) ~ Bin(4, 1/3) • V(Xm|Xc = 0) = V(Xs|Xc = 0) = 8/9 • the total profit (Y|Xc = 0) = 2(Xm|Xc = 0) + (Xs|Xc = 0) • since (Xm|Xc = 0) + (Xs|Xc = 0) = 4, (Y|Xc = 0) = 4 + (Xm|Xc = 0) • V(Y|Xc = 0) = V(4 + (Xm|Xc = 0)) = V(Xm|Xc = 0) = 8/9
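Part (c) can be verified by enumerating the conditional distribution of the profit directly; a sketch of the check (not part of the original solution), again using exact rationals:

```python
from fractions import Fraction as F
from math import comb

# given Xc = 0, each of the 4 cakes is mango w.p. 2/3, strawberry w.p. 1/3
pm = F(2, 3)
mean = var = F(0)
for xm in range(5):                      # xm mango cakes, 4 - xm strawberry
    prob = comb(4, xm) * pm**xm * (1 - pm)**(4 - xm)   # Bin(4, 2/3) pmf
    profit = 2 * xm + (4 - xm)           # $2 per mango + $1 per strawberry
    mean += prob * profit
    var += prob * profit**2
var -= mean**2                           # V(Y | Xc = 0) = E[Y^2] - E[Y]^2
```

The enumeration gives E(Y|Xc = 0) = 20/3 and V(Y|Xc = 0) = 8/9, agreeing with the slide's shortcut V(4 + Xm|Xc = 0) = V(Xm|Xc = 0).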

  27. Recursive Relationships

  28. Two Innocent Equations • A and B: two events • P(A) = P(A|B)P(B) + P(A|Bc)P(Bc) • generalization: for ∪j Bj = Ω and Bi ∩ Bj = ∅ for i ≠ j, P(A) = Σj P(A|Bj)P(Bj) • recursive equations induced by these equations

  29. Recursive Relationship • a special property in some random phenomena: changing back to oneself, or to something related • e.g., flipping a coin until getting the first head: if the first flip = H, the experiment ends; if the first flip = T, the experiment restarts after one flip, i.e., (flips until the first head) = 1 + (flips until the first head, restarted)
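Conditioning on the first flip turns this self-similarity into the recursive equation E[N] = p·1 + (1−p)(1 + E[N]) for the expected number of flips N until the first head, whose solution is E[N] = 1/p. A minimal sketch solving the recursion by fixed-point iteration rather than algebra:

```python
def mean_flips(p, tol=1e-12):
    """Expected flips until the first head, via the recursion
    E[N] = p * 1 + (1 - p) * (1 + E[N]), iterated to a fixed point."""
    e = 0.0
    while True:
        nxt = p * 1 + (1 - p) * (1 + e)   # condition on the first flip
        if abs(nxt - e) < tol:
            return nxt
        e = nxt
```

The iteration is a contraction with factor 1 − p, so it converges to 1/p, e.g. mean_flips(0.5) ≈ 2.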

  30. Recursive Relationship • [flow diagram] a random phenomenon produces outcomes of types 1, …, k, each leading to a simple analysis, plus a type k+1 outcome leading to a difficult problem • the difficult type k+1 problem may become easy if it is related back to the original random phenomenon

  31. More Recursive Relationships • [flow diagram] random phenomena A and B each produce a simple outcome and a difficult problem related to the other phenomenon • the two problems may be solved easily if they evolve into each other

  32. About Recursive Relationships • more forms, possibly involving more than 2 random phenomena • identifying the relationships among random phenomena being an art, not necessarily a science

  33. Exercise 3.1.2 of Notes • n contractors bidding for m projects (n ≥ m) • one project for each contractor • all projects being equally profitable • random independent bids by contractors • Ai = the event that project i, i ≤ m, is bid (by at least one contractor) • (a). Find P(A1c) • (b). Find P(A1) • (c). Find P(A1c ∩ A2c) • (d). Find P(A2|A1)

  34. Exercise 3.1.2 of Notes random bids by n contractors on m projects (n ≥ m), one project for each contractor • Ai = the event that project i, i ≤ m, is bid (by at least one contractor) • (a). P(A1c) = ((m−1)/m)^n • (b). P(A1) = 1 − ((m−1)/m)^n • (c). P(A1c ∩ A2c) = ((m−2)/m)^n • (d). to find P(A2|A1), note that P(A2|A1) = P(A1 ∩ A2)/P(A1), where P(A1 ∩ A2) = 1 − P(A1c) − P(A2c) + P(A1c ∩ A2c)
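Assuming the model in which each of the n contractors independently bids on one of the m projects uniformly at random (my reading of "random independent bids"; the notes may specify a different bidding rule), the answers can be cross-checked by simulation. A hypothetical sketch:

```python
import random

def bid_probs(n, m, trials=200_000):
    """Monte Carlo estimates of P(A1) and P(A2 | A1), where each of n
    contractors independently bids on one of m projects uniformly at random.
    Projects are numbered 0..m-1, so A1 is 'project 0 receives a bid'."""
    a1 = a12 = 0
    for _ in range(trials):
        bids = {random.randrange(m) for _ in range(n)}  # set of projects bid on
        if 0 in bids:
            a1 += 1
            a12 += 1 in bids
    return a1 / trials, a12 / a1      # P(A1) estimate, P(A2 | A1) estimate
```

For n = m = 3 this gives roughly P(A1) ≈ 19/27 and P(A2|A1) ≈ 12/19, consistent with the inclusion-exclusion computation above.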

  35. Examples of Ross in Chapter 3 • Examples 3.2, 3.3, 3.4, 3.5, 3.6, 3.7
