
  1. Chapter 2 Multivariate Distributions Math 6203 Fall 2009 Instructor: Ayona Chatterjee

  2. Random Vector • Given a random experiment with a sample space C, consider two random variables X1 and X2 that assign to each element c of C one and only one ordered pair of numbers X1(c) = x1 and X2(c) = x2. Then we say that (X1, X2) is a random vector. • The space of (X1, X2) is the set of ordered pairs D = {(x1, x2) : x1 = X1(c) and x2 = X2(c) for some c in C}.

  3. Cumulative Distribution Function • The joint cumulative distribution function of (X1, X2) is denoted by FX1,X2(x1, x2) and is given as FX1,X2(x1, x2) = P[X1 ≤ x1, X2 ≤ x2]. • A random vector (X1, X2) is discrete if its space D is finite or countable. • A random vector (X1, X2) with space D is continuous if its cdf FX1,X2(x1, x2) is continuous.

  4. Probability Mass Function • For discrete random variables X1 and X2, the joint pmf is defined as pX1,X2(x1, x2) = P[X1 = x1, X2 = x2] for (x1, x2) in the space D.
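As a concrete, hedged illustration (not part of the original slides), the sketch below stores a small joint pmf as a Python table; the probabilities are made-up values chosen only so that they are nonnegative and sum to 1.

```python
# Hypothetical joint pmf of (X1, X2), stored as a table: (x1, x2) -> probability.
# The numbers are illustrative only; any nonnegative values summing to 1 would do.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# A valid pmf is nonnegative and sums to 1 over the support D.
assert all(p >= 0 for p in joint_pmf.values())
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

# P[X1 = 1, X2 = 0] is read directly from the table.
print(joint_pmf[(1, 0)])  # 0.3
```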

  5. Probability Density Function • For a continuous random vector (X1, X2), the joint pdf fX1,X2(x1, x2) is a nonnegative function whose double integral over the region {w1 ≤ x1, w2 ≤ x2} equals FX1,X2(x1, x2); probabilities of events are obtained by integrating the joint pdf over the corresponding region of the plane.

  6. Marginals • The marginal distributions can be obtained from the joint probability density function. • For a discrete random vector the marginal pmf of X1 is pX1(x1) = Σx2 pX1,X2(x1, x2), and for a continuous random vector the marginal pdf of X1 is fX1(x1) = ∫ fX1,X2(x1, x2) dx2 (and symmetrically for X2), as illustrated in the sketch below.
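A minimal sketch (again using the hypothetical table introduced above, not an example from the slides) of how the marginal pmfs fall out of the joint pmf by summing over the other coordinate.

```python
from collections import defaultdict

joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

# Marginal of X1: sum the joint pmf over all values of x2 (and symmetrically for X2).
p_x1 = defaultdict(float)
p_x2 = defaultdict(float)
for (x1, x2), p in joint_pmf.items():
    p_x1[x1] += p
    p_x2[x2] += p

print(dict(p_x1))  # {0: 0.30..., 1: 0.70...}
print(dict(p_x2))  # {0: 0.40..., 1: 0.60...}
```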

  7. Expectation • Suppose (X1, X2) is of the continuous type and let Y = g(X1, X2). Then E(Y) exists if the double integral of |g(x1, x2)| fX1,X2(x1, x2) over the plane is finite, in which case E(Y) is the double integral of g(x1, x2) fX1,X2(x1, x2); in the discrete case the integrals are replaced by sums over the support.
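A hedged discrete sketch of the same idea: E[g(X1, X2)] is just the pmf-weighted sum of g over the support (the table and the choice g(x1, x2) = x1·x2 are illustrative assumptions).

```python
joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

# E[g(X1, X2)] = sum over the support of g(x1, x2) * p(x1, x2); here g(x1, x2) = x1 * x2.
g = lambda x1, x2: x1 * x2
e_y = sum(g(x1, x2) * p for (x1, x2), p in joint_pmf.items())
print(e_y)  # 0.4
```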

  8. Theorem • Let (X1, X2) be a random vector. Let Y1 = g1(X1, X2) and Y2 = g2(X1, X2) be random variables whose expectations exist. Then for any real numbers k1 and k2, E(k1Y1 + k2Y2) = k1E(Y1) + k2E(Y2).
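A quick numerical check of this linearity property on the hypothetical table; the particular choices of g1, g2, k1, and k2 below are arbitrary.

```python
joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative
E = lambda g: sum(g(x1, x2) * p for (x1, x2), p in joint_pmf.items())

k1, k2 = 2.0, -3.0
y1 = lambda x1, x2: x1 + x2   # Y1 = g1(X1, X2)
y2 = lambda x1, x2: x1 * x2   # Y2 = g2(X1, X2)

# Expectation is linear: E(k1*Y1 + k2*Y2) = k1*E(Y1) + k2*E(Y2).
lhs = E(lambda x1, x2: k1 * y1(x1, x2) + k2 * y2(x1, x2))
rhs = k1 * E(y1) + k2 * E(y2)
print(abs(lhs - rhs) < 1e-12)  # True
```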

  9. Note

  10. Moment Generating Function • Let X = (X1, X2)' be a random vector. If E(e^(t1X1 + t2X2)) exists for |t1| < h1 and |t2| < h2, where h1 and h2 are positive, the mgf is given as MX1,X2(t1, t2) = E(e^(t1X1 + t2X2)).
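For a finite support the expectation defining the mgf is a plain sum, which the hedged sketch below evaluates pointwise for the hypothetical table.

```python
import math

joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

def mgf(t1, t2):
    # M(t1, t2) = E[exp(t1*X1 + t2*X2)], a finite sum here because the support is finite.
    return sum(math.exp(t1 * x1 + t2 * x2) * p for (x1, x2), p in joint_pmf.items())

print(mgf(0.0, 0.0))   # 1.0: every mgf equals 1 at the origin
print(mgf(0.5, -0.2))  # value of the joint mgf at (0.5, -0.2)
```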

  11. 2.3 CONDITIONAL DISTRIBUTIONS AND EXPECTATIONS • So far we know • How to find marginals given the joint distribution. • Now • Look at conditional distributions: the distribution of one of the random variables when the other takes a specific value.

  12. Conditional pmf • We define pX2|X1(x2|x1) = pX1,X2(x1, x2) / pX1(x1), for x2 in SX2. • SX2 is the support of X2. • Here we assume pX1(x1) > 0. • Thus the conditional pmf is the joint divided by the marginal.
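A short sketch of this "joint divided by marginal" computation for the hypothetical table; the function name and the table itself are illustrative assumptions.

```python
joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

def conditional_pmf_x2_given_x1(x1):
    # Conditional pmf of X2 given X1 = x1: joint pmf divided by the marginal of X1.
    p_x1 = sum(p for (a, _), p in joint_pmf.items() if a == x1)
    assert p_x1 > 0, "conditioning value must have positive marginal probability"
    return {x2: p / p_x1 for (a, x2), p in joint_pmf.items() if a == x1}

print(conditional_pmf_x2_given_x1(1))  # {0: 0.3/0.7, 1: 0.4/0.7}
```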

  13. Conditional pdf • Let fX1,X2(x1, x2) be the joint pdf and fX1(x1) and fX2(x2) be the marginals for X1 and X2 respectively; then the conditional pdf of X2, given X1 = x1, is fX2|X1(x2|x1) = fX1,X2(x1, x2) / fX1(x1), provided fX1(x1) > 0.

  14. Note

  15. Conditional Expectation and Variance • The conditional expectation E(X2|x1) is the mean of the conditional distribution of X2 given X1 = x1. • The conditional variance is Var(X2|x1) = E{[X2 − E(X2|x1)]² | x1} = E(X2²|x1) − [E(X2|x1)]².

  16. Theorem • Let (X1, X2) be a random vector such that the variance of X2 is finite. Then • E[E(X2|X1)] = E(X2) • Var[E(X2|X1)] ≤ Var(X2)
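A hedged numerical check of both parts of this theorem on the hypothetical discrete table used in the earlier sketches.

```python
joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

# Marginal of X1 and conditional mean E(X2 | X1 = x1), both computed from the table.
p_x1 = {x1: sum(p for (a, _), p in joint_pmf.items() if a == x1) for x1 in (0, 1)}
def cond_mean_x2(x1):
    return sum(x2 * p for (a, x2), p in joint_pmf.items() if a == x1) / p_x1[x1]

# E[E(X2|X1)] = E(X2)
e_x2 = sum(x2 * p for (_, x2), p in joint_pmf.items())
e_of_cond = sum(cond_mean_x2(x1) * p for x1, p in p_x1.items())
print(abs(e_of_cond - e_x2) < 1e-12)  # True

# Var[E(X2|X1)] <= Var(X2)
var_x2 = sum((x2 - e_x2) ** 2 * p for (_, x2), p in joint_pmf.items())
var_cond = sum((cond_mean_x2(x1) - e_x2) ** 2 * p for x1, p in p_x1.items())
print(var_cond <= var_x2)  # True
```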

  17. 2.4 The Correlation Coefficient • Define ρ = Cov(X,Y) / (σX σY), where Cov(X,Y) = E[(X − µ1)(Y − µ2)] and σX, σY are the standard deviations of X and Y. Here ρ is called the correlation coefficient of X and Y, and Cov(X,Y) is the covariance between X and Y.
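A small sketch computing ρ directly from its definition for the hypothetical table; the numbers have no special meaning.

```python
joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

E = lambda g: sum(g(x, y) * p for (x, y), p in joint_pmf.items())

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mu_x) ** 2)
var_y = E(lambda x, y: (y - mu_y) ** 2)
cov_xy = E(lambda x, y: (x - mu_x) * (y - mu_y))

# Correlation coefficient: covariance divided by the product of the standard deviations.
rho = cov_xy / (var_x ** 0.5 * var_y ** 0.5)
print(rho)  # about -0.089 for this table
```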

  18. The Correlation Coefficient • Note that -1 ≤ ρ ≤ 1. • For the bivariate case: • If ρ = 1, the graph of the line y = a + bx (b > 0) contains all the probability of the distribution of X and Y. • For ρ = -1, the same is true for a line y = a + bx with b < 0. • For the non-extreme cases, ρ can be viewed as a measure of how strongly the probability of X and Y concentrates about a line y = a + bx.

  19. Theorem • Suppose (X,Y) have a joint distribution with the variances of X and Y finite and positive. Denote the means and variances of X and Y by µ1, µ2 and σ1², σ2² respectively, and let ρ be the correlation coefficient between X and Y. If E(Y|X) is linear in X, then E(Y|X) = µ2 + ρ(σ2/σ1)(X − µ1) and E[Var(Y|X)] = σ2²(1 − ρ²).

  20. 2.5 Independent Random Variables • If the conditional pdf f2|1(x2|x1) does not depend upon x1, then the marginal pdf of X2 equals the conditional pdf f2|1(x2|x1). • Let the random variables X and Y have joint pdf f(x,y) and marginals fx(x) and fy(y) respectively. The random variables X and Y are said to be independent if and only if • f(x,y) = fx(x) fy(y) • A similar definition can be written for discrete random variables. • Random variables that are not independent are said to be dependent.
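For a discrete pair the definition can be checked directly: a hedged sketch testing whether the hypothetical joint table factors into the product of its marginals at every support point.

```python
joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative
p_x = {0: 0.3, 1: 0.7}  # marginal of X from the table
p_y = {0: 0.4, 1: 0.6}  # marginal of Y from the table

# Independence requires joint = product of marginals at every point of the support.
independent = all(abs(p - p_x[x] * p_y[y]) < 1e-12 for (x, y), p in joint_pmf.items())
print(independent)  # False here, since 0.10 != 0.3 * 0.4
```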

  21. Theorem • Let the random variables X and Y have supports S1 and S2, respectively, and have the joint pdf f(x,y). Then X and Y are independent if and only if f(x,y) can be written as a product of a nonnegative function of x and a nonnegative function of y; that is, f(x,y) = g(x)h(y), where g(x) > 0 for x in S1 and h(y) > 0 for y in S2.

  22. Note • In general, X and Y must be dependent if the space of positive probability density of X and Y is bounded by a curve that is neither a horizontal nor a vertical line. • Example: f(x,y) = 8xy, 0 < x < y < 1 • S = {(x,y) : 0 < x < y < 1}. This is not a product space, so X and Y are dependent (see the sketch below).
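A sketch of the example just given. The closed forms of the marginals, fX(x) = 4x(1 − x²) and fY(y) = 4y³, follow from integrating out the other variable over the triangular support; checking a single point shows the joint pdf cannot equal the product of the marginals.

```python
# Example from the slide: f(x, y) = 8xy on the triangular support 0 < x < y < 1.
def f(x, y):
    return 8 * x * y if 0 < x < y < 1 else 0.0

# Marginals obtained by integrating out the other variable over the triangle:
# fX(x) = integral over y in (x, 1) of 8xy dy = 4x(1 - x**2)
# fY(y) = integral over x in (0, y) of 8xy dx = 4y**3
def f_x(x):
    return 4 * x * (1 - x ** 2) if 0 < x < 1 else 0.0

def f_y(y):
    return 4 * y ** 3 if 0 < y < 1 else 0.0

# At (0.8, 0.5) both marginals are positive but the joint pdf is 0, so
# f(x, y) != fX(x) * fY(y): X and Y are dependent (the support is not a product space).
print(f(0.8, 0.5), f_x(0.8) * f_y(0.5))  # 0.0 versus a positive number
```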

  23. Theorems • Let (X, Y) have the joint cdf F(x,y) and let X and Y have the marginal cdfs Fx(x) and Fy(y) respectively. Then X and Y are independent if and only if • F(x,y) = Fx(x)Fy(y) • The random variables X and Y are independent if and only if the following condition holds: • P(a < X ≤ b, c < Y ≤ d) = P(a < X ≤ b)P(c < Y ≤ d) • for all constants a < b and c < d.

  24. Theorems • Suppose X and Y are independent and that E(u(X)) and E(v(Y)) exist; then • E[u(X)v(Y)] = E[u(X)]E[v(Y)] • Suppose the joint mgf M(t1,t2) exists for the random variables X and Y. Then X and Y are independent if and only if • M(t1,t2) = M(t1,0)M(0,t2) • That is, the joint mgf is the product of the marginal mgfs.
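A hedged check of the mgf factorization criterion on the hypothetical dependent table: finding one point (t1, t2) where M(t1, t2) differs from M(t1, 0)M(0, t2) is enough to rule out independence.

```python
import math

joint_pmf = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}  # illustrative

def mgf(t1, t2):
    # Joint mgf M(t1, t2) = E[exp(t1*X + t2*Y)], a finite sum over the support.
    return sum(math.exp(t1 * x + t2 * y) * p for (x, y), p in joint_pmf.items())

# For independent X and Y these two numbers would agree for every (t1, t2);
# here they differ, consistent with the table being dependent.
t1, t2 = 0.7, -0.4
print(mgf(t1, t2), mgf(t1, 0.0) * mgf(0.0, t2))
```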

  25. Note • If X and Y are independent, then the correlation coefficient is zero. • However, a zero correlation coefficient does not imply independence (a classic counterexample is sketched below).
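A standard counterexample (not from the slides): X uniform on {-1, 0, 1} and Y = X², which are uncorrelated yet clearly dependent.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1} and Y = X**2: the joint pmf puts mass 1/3 on each (x, x**2).
joint = {(x, x ** 2): Fraction(1, 3) for x in (-1, 0, 1)}

E = lambda g: sum(g(x, y) * p for (x, y), p in joint.items())
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(cov)  # 0: X and Y are uncorrelated

# Yet not independent: P(X=0, Y=0) = 1/3 while P(X=0) * P(Y=0) = 1/3 * 1/3 = 1/9.
p_x0 = sum(p for (x, _), p in joint.items() if x == 0)
p_y0 = sum(p for (_, y), p in joint.items() if y == 0)
print(joint[(0, 0)], p_x0 * p_y0)  # 1/3 versus 1/9
```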
