
Marginal and Conditional distributions


  1. Marginal and Conditional distributions

2. Theorem (Marginal distributions for the multivariate Normal distribution): Suppose $\mathbf{x} = \begin{bmatrix}\mathbf{x}^{(1)} \\ \mathbf{x}^{(2)}\end{bmatrix}$ has a $p$-variate Normal distribution with mean vector $\boldsymbol{\mu} = \begin{bmatrix}\boldsymbol{\mu}^{(1)} \\ \boldsymbol{\mu}^{(2)}\end{bmatrix}$ and covariance matrix $\Sigma = \begin{bmatrix}\Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22}\end{bmatrix}$. Then the marginal distribution of $\mathbf{x}^{(i)}$ is a $q_i$-variate Normal distribution ($q_1 = q$, $q_2 = p - q$) with mean vector $\boldsymbol{\mu}^{(i)}$ and covariance matrix $\Sigma_{ii}$.

3. Theorem (Conditional distributions for the multivariate Normal distribution): Suppose $\mathbf{x} = \begin{bmatrix}\mathbf{x}^{(1)} \\ \mathbf{x}^{(2)}\end{bmatrix}$ has a $p$-variate Normal distribution with mean vector $\boldsymbol{\mu}$ and covariance matrix $\Sigma$, partitioned as on the previous slide. Then the conditional distribution of $\mathbf{x}^{(2)}$ given $\mathbf{x}^{(1)} = \mathbf{x}_1$ is a $q_2$-variate Normal distribution with mean vector $\boldsymbol{\mu}^{(2)} + \Sigma_{21}\Sigma_{11}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}^{(1)})$ and covariance matrix $\Sigma_{22\cdot 1} = \Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12}$.

4. $\Sigma_{22\cdot 1} = \Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12}$ is called the matrix of partial variances and covariances. Its element $\sigma_{ij\cdot 1,\dots,q}$ is called the partial covariance (variance if $i = j$) between $x_i$ and $x_j$ given $x_1, \dots, x_q$, and $\rho_{ij\cdot 1,\dots,q} = \dfrac{\sigma_{ij\cdot 1,\dots,q}}{\sqrt{\sigma_{ii\cdot 1,\dots,q}\,\sigma_{jj\cdot 1,\dots,q}}}$ is called the partial correlation between $x_i$ and $x_j$ given $x_1, \dots, x_q$.

5. $B = \Sigma_{21}\Sigma_{11}^{-1}$ is called the matrix of regression coefficients for predicting $x_{q+1}, x_{q+2}, \dots, x_p$ from $x_1, \dots, x_q$. The mean vector of $x_{q+1}, x_{q+2}, \dots, x_p$ given $x_1, \dots, x_q$ is $\boldsymbol{\mu}^{(2)} + \Sigma_{21}\Sigma_{11}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}^{(1)})$.
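The partitioned formulas on slides 2-5 translate directly into a few lines of linear algebra. The sketch below is a minimal illustration in Python/NumPy, using a made-up 4-variate mean vector and covariance matrix (not the values from the example that follows); it computes the marginal of $(x_1, x_2)$, the regression-coefficient matrix $B = \Sigma_{21}\Sigma_{11}^{-1}$, and the conditional mean and partial covariance matrix of $(x_3, x_4)$ given $(x_1, x_2)$.

```python
# Minimal sketch (NumPy): marginal and conditional pieces of a 4-variate normal.
# The mean vector and covariance matrix below are made up for illustration only.
import numpy as np

mu = np.array([1.0, 2.0, 3.0, 4.0])
Sigma = np.array([[4.0, 1.0, 1.0, 0.5],
                  [1.0, 3.0, 0.5, 1.0],
                  [1.0, 0.5, 3.0, 0.5],
                  [0.5, 1.0, 0.5, 3.0]])    # symmetric, positive definite

q = 2                                       # split x into (x1, x2) and (x3, x4)
mu1, mu2 = mu[:q], mu[q:]
S11, S12 = Sigma[:q, :q], Sigma[:q, q:]
S21, S22 = Sigma[q:, :q], Sigma[q:, q:]

# Marginal of (x1, x2): just pick out mu1 and S11 (slide 2)
print("marginal mean:", mu1, "\nmarginal cov:\n", S11)

# Matrix of regression coefficients B = Sigma21 Sigma11^{-1} (slide 5)
B = S21 @ np.linalg.inv(S11)

# Conditional distribution of (x3, x4) given (x1, x2) = x_obs (slide 3)
x_obs = np.array([0.5, 2.5])                # hypothetical observed values
cond_mean = mu2 + B @ (x_obs - mu1)
cond_cov = S22 - B @ S12                    # partial covariance matrix (slide 4)

print("conditional mean:", cond_mean)
print("partial covariance matrix:\n", cond_cov)
```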

6. Example: Suppose that $\mathbf{x} = [x_1, x_2, x_3, x_4]'$ is 4-variate Normal with a given mean vector $\boldsymbol{\mu}$ and covariance matrix $\Sigma$.

7. The marginal distribution of a chosen pair of components is bivariate Normal, with mean vector and covariance matrix obtained by picking out the corresponding elements of $\boldsymbol{\mu}$ and the corresponding rows and columns of $\Sigma$; likewise, the marginal distribution of a chosen set of three components is trivariate Normal.

8. Find the conditional distribution of $(x_3, x_4)$ given $(x_1, x_2)$. Now partition $\boldsymbol{\mu}$ into $\boldsymbol{\mu}^{(1)}, \boldsymbol{\mu}^{(2)}$ and $\Sigma$ into $\Sigma_{11}, \Sigma_{12}, \Sigma_{21}, \Sigma_{22}$ accordingly.

9. The matrix of regression coefficients for predicting $x_3, x_4$ from $x_1, x_2$ is $B = \Sigma_{21}\Sigma_{11}^{-1}$.

10. Thus the conditional distribution of $(x_3, x_4)$ given $(x_1, x_2)$ is bivariate Normal with mean vector $\boldsymbol{\mu}^{(2)} + \Sigma_{21}\Sigma_{11}^{-1}(\mathbf{x}_1 - \boldsymbol{\mu}^{(1)})$ and partial covariance matrix $\Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12}$.

11. Using SPSS. Note: the use of another statistical package, such as Minitab, is similar.

12. The first step is to input the data. The data are usually contained in some type of file:
• Text files
• Excel files
• Other types of files
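For readers who prefer to work outside SPSS, the same data-input and bivariate-correlation steps can be sketched in Python with pandas; the file name and column layout below are hypothetical placeholders (the column names are those used in the SPSS example later in the transcript).

```python
# Sketch of the data-input and bivariate-correlation steps outside SPSS (pandas).
# File name and column layout are hypothetical placeholders.
import pandas as pd

# Text (CSV) file; use pd.read_excel("data.xlsx") for an Excel file instead.
df = pd.read_csv("data.csv")   # columns assumed: AGE, HT, WT, CHL, ALB, CA, UA

# Bivariate (Pearson) correlation matrix, the analogue of
# Analyze > Correlate > Bivariate in SPSS
print(df.corr(method="pearson"))
```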

13. After starting the SPSS program, the following dialogue box appears:

14. If you select Open an existing file and press OK, the following dialogue box appears:

15. Once you have selected the file and its type:

16. The following dialogue box appears:

17. If the variable names are in the file, ask the program to read the names. If you do not specify the Range, the program will identify it. Once you click OK, two windows will appear:

18. One window contains the output.

19. The other contains the data:

20. To perform any statistical analysis, select the Analyze menu:

21. To compute correlations, select Correlate, then Bivariate. To compute partial correlations, select Correlate, then Partial.

22. For Bivariate correlation, the following dialogue appears:

23. The output for Bivariate correlation:

24. For Partial correlation, the following dialogue appears:

25. The output for partial correlation:

- - -  P A R T I A L   C O R R E L A T I O N   C O E F F I C I E N T S  - - -

Controlling for..  AGE  HT  WT

          CHL          ALB          CA           UA
CHL     1.0000        .1299        .2957        .2338
       (    0)       (  178)      (  178)      (  178)
       P= .          P= .082      P= .000      P= .002

ALB      .1299       1.0000        .4778        .1226
       (  178)       (    0)      (  178)      (  178)
       P= .082       P= .         P= .000      P= .101

CA       .2957        .4778       1.0000        .1737
       (  178)       (  178)      (    0)      (  178)
       P= .000       P= .000      P= .         P= .020

UA       .2338        .1226        .1737       1.0000
       (  178)       (  178)      (  178)      (    0)
       P= .002       P= .101      P= .020      P= .

(Coefficient / (D.F.) / 2-tailed Significance)
" . " is printed if a coefficient cannot be computed

  26. Compare these with the bivariate correlation:

27. Partial Correlations:

       CHL      ALB      CA       UA
CHL   1.0000    .1299    .2957    .2338
ALB    .1299   1.0000    .4778    .1226
CA     .2957    .4778   1.0000    .1737
UA     .2338    .1226    .1737   1.0000

Bivariate Correlations:
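The partial correlations above (CHL, ALB, CA and UA, controlling for AGE, HT and WT) can also be reproduced outside SPSS. A minimal sketch, continuing from the pandas DataFrame `df` read in the earlier sketch: regress each variable of interest on the controlling variables and correlate the residuals.

```python
# Partial correlations by residualizing: a sketch assuming a DataFrame `df`
# with columns AGE, HT, WT, CHL, ALB, CA, UA (as in the SPSS example above).
import numpy as np
import pandas as pd

def partial_corr(df, variables, controls):
    """Correlations among `variables` after removing the linear effect of `controls`."""
    X = np.column_stack([np.ones(len(df)), df[controls].to_numpy()])
    resid = {}
    for v in variables:
        y = df[v].to_numpy()
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares fit on controls
        resid[v] = y - X @ beta                        # part of v not explained by controls
    return pd.DataFrame(resid).corr()

# e.g. partial_corr(df, ["CHL", "ALB", "CA", "UA"], ["AGE", "HT", "WT"])
```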

28. In the last example the bivariate and partial correlations were roughly in agreement. This is not necessarily the case in all situations. An example: the following data were collected on three variables:
• Age
• Calcium intake in diet (CAI)
• Bone mass density (BMI)

  29. The data

  30. Bivariate correlations

  31. Partial correlations

  32. Scatter plot CAI vs BMI (r = -0.447)

33. [Scatter plot; point labels 25, 35, 45, 55, 65, 75]

  34. 3D Plot Age, CAI and BMI
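The disagreement between the bivariate and partial correlations of CAI and BMI is the classic effect of a lurking variable (here, age). The sketch below simulates made-up data with the same qualitative structure (both CAI and BMI driven by age, with CAI rising and BMI falling as age increases) and compares the raw correlation with the age-adjusted partial correlation; the numbers are illustrative only, not the data from the slides.

```python
# Illustrative simulation (made-up numbers): age drives CAI up and BMI down,
# so the raw CAI-BMI correlation is negative even though, at a fixed age,
# higher calcium intake goes with higher bone mass density.
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(25, 75, n)
cai = 0.02 * age + rng.normal(0, 0.3, n)                 # calcium intake rises with age
bmi = -0.03 * age + 0.5 * cai + rng.normal(0, 0.3, n)    # bone mass falls with age, rises with CAI

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print("bivariate corr(CAI, BMI):", corr(cai, bmi))       # negative

# Partial correlation controlling for age: correlate the residuals
def resid(y, x):
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

print("partial corr given age:", corr(resid(cai, age), resid(bmi, age)))  # positive
```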

35. Transformations. Theorem: Let $x_1, x_2, \dots, x_n$ denote random variables with joint probability density function $f(x_1, x_2, \dots, x_n)$. Let
$u_1 = h_1(x_1, x_2, \dots, x_n)$
$u_2 = h_2(x_1, x_2, \dots, x_n)$
$\vdots$
$u_n = h_n(x_1, x_2, \dots, x_n)$
define an invertible transformation from the $x$'s to the $u$'s.

36. Then the joint probability density function of $u_1, u_2, \dots, u_n$ is given by
$g(u_1, \dots, u_n) = f\bigl(x_1(u_1, \dots, u_n), \dots, x_n(u_1, \dots, u_n)\bigr)\,\lvert J \rvert,$
where
$J = \det\!\begin{bmatrix} \frac{\partial x_1}{\partial u_1} & \cdots & \frac{\partial x_1}{\partial u_n} \\ \vdots & & \vdots \\ \frac{\partial x_n}{\partial u_1} & \cdots & \frac{\partial x_n}{\partial u_n} \end{bmatrix}$
is the Jacobian of the transformation.

37. Example: Suppose that $x_1, x_2$ are independent with density functions $f_1(x_1)$ and $f_2(x_2)$. Find the distribution of
$u_1 = x_1 + x_2$
$u_2 = x_1 - x_2$
Solving for $x_1$ and $x_2$, we get the inverse transformation
$x_1 = \dfrac{u_1 + u_2}{2}, \qquad x_2 = \dfrac{u_1 - u_2}{2}.$

38. The Jacobian of the transformation is
$J = \det\!\begin{bmatrix} \frac{\partial x_1}{\partial u_1} & \frac{\partial x_1}{\partial u_2} \\ \frac{\partial x_2}{\partial u_1} & \frac{\partial x_2}{\partial u_2} \end{bmatrix} = \det\!\begin{bmatrix} \tfrac12 & \tfrac12 \\ \tfrac12 & -\tfrac12 \end{bmatrix} = -\tfrac12, \qquad \lvert J \rvert = \tfrac12.$

39. The joint density of $x_1, x_2$ is $f(x_1, x_2) = f_1(x_1)\,f_2(x_2)$. Hence the joint density of $u_1$ and $u_2$ is
$g(u_1, u_2) = f_1\!\left(\dfrac{u_1 + u_2}{2}\right) f_2\!\left(\dfrac{u_1 - u_2}{2}\right)\cdot\dfrac{1}{2}.$
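A quick symbolic check of the example on slides 37-39. The choice of standard normal densities for $f_1$ and $f_2$ is an added assumption, made only so the resulting density has a recognizable form; the Jacobian computation itself does not depend on it.

```python
# Symbolic check of slides 37-39, assuming (purely for illustration)
# that x1 and x2 are independent standard normal variables.
import sympy as sp

u1, u2 = sp.symbols('u1 u2', real=True)

# Inverse transformation x1 = (u1 + u2)/2, x2 = (u1 - u2)/2
x1 = (u1 + u2) / 2
x2 = (u1 - u2) / 2

# Jacobian of the inverse transformation d(x1, x2)/d(u1, u2)
J = sp.Matrix([[sp.diff(x1, u1), sp.diff(x1, u2)],
               [sp.diff(x2, u1), sp.diff(x2, u2)]]).det()
print(J)            # -1/2, so |J| = 1/2

# Standard normal densities for f1 and f2 (an assumed, illustrative choice)
f = lambda t: sp.exp(-t**2 / 2) / sp.sqrt(2 * sp.pi)

g = sp.simplify(f(x1) * f(x2) * sp.Abs(J))
print(g)            # exp(-(u1**2 + u2**2)/4) / (4*pi): u1 and u2 come out
                    # as independent N(0, 2) variables, as expected
```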

40. Theorem: Let $x_1, x_2, \dots, x_n$ denote random variables with joint probability density function $f(x_1, x_2, \dots, x_n)$. Let
$u_1 = a_{11}x_1 + a_{12}x_2 + \dots + a_{1n}x_n + c_1$
$u_2 = a_{21}x_1 + a_{22}x_2 + \dots + a_{2n}x_n + c_2$
$\vdots$
$u_n = a_{n1}x_1 + a_{n2}x_2 + \dots + a_{nn}x_n + c_n$
define an invertible linear transformation from the $x$'s to the $u$'s; in matrix form, $\mathbf{u} = A\mathbf{x} + \mathbf{c}$ with $A = (a_{ij})$ nonsingular.

41. Then the joint probability density function of $u_1, u_2, \dots, u_n$ is given by
$g(u_1, \dots, u_n) = f\bigl(A^{-1}(\mathbf{u} - \mathbf{c})\bigr)\,\dfrac{1}{\lvert \det A \rvert},$
where $\mathbf{u} = (u_1, \dots, u_n)'$ and $\mathbf{c} = (c_1, \dots, c_n)'$.

42. Theorem: Suppose that the random vector $\mathbf{x} = [x_1, x_2, \dots, x_p]'$ has a $p$-variate Normal distribution with mean vector $\boldsymbol{\mu}$ and covariance matrix $\Sigma$. Then $\mathbf{u} = A\mathbf{x} + \mathbf{c}$, with $A$ a nonsingular $p \times p$ matrix, has a $p$-variate Normal distribution with mean vector $A\boldsymbol{\mu} + \mathbf{c}$ and covariance matrix $A\Sigma A'$.

44. Proof: Let $\mathbf{u} = A\mathbf{x} + \mathbf{c}$. By the previous theorem,
$g(\mathbf{u}) = f\bigl(A^{-1}(\mathbf{u} - \mathbf{c})\bigr)\,\dfrac{1}{\lvert \det A \rvert},$
and since
$f(\mathbf{x}) = \dfrac{1}{(2\pi)^{p/2}\lvert\Sigma\rvert^{1/2}} \exp\!\Bigl[-\tfrac12(\mathbf{x}-\boldsymbol{\mu})'\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\Bigr],$
then
$g(\mathbf{u}) = \dfrac{1}{(2\pi)^{p/2}\lvert\Sigma\rvert^{1/2}\lvert \det A \rvert} \exp\!\Bigl[-\tfrac12\bigl(A^{-1}(\mathbf{u}-\mathbf{c})-\boldsymbol{\mu}\bigr)'\Sigma^{-1}\bigl(A^{-1}(\mathbf{u}-\mathbf{c})-\boldsymbol{\mu}\bigr)\Bigr].$

45. Since
$A^{-1}(\mathbf{u}-\mathbf{c})-\boldsymbol{\mu} = A^{-1}\bigl(\mathbf{u}-(A\boldsymbol{\mu}+\mathbf{c})\bigr),$
the exponent equals
$-\tfrac12\bigl(\mathbf{u}-(A\boldsymbol{\mu}+\mathbf{c})\bigr)'(A^{-1})'\Sigma^{-1}A^{-1}\bigl(\mathbf{u}-(A\boldsymbol{\mu}+\mathbf{c})\bigr) = -\tfrac12\bigl(\mathbf{u}-(A\boldsymbol{\mu}+\mathbf{c})\bigr)'(A\Sigma A')^{-1}\bigl(\mathbf{u}-(A\boldsymbol{\mu}+\mathbf{c})\bigr).$
Also
$\lvert\Sigma\rvert^{1/2}\lvert \det A \rvert = \lvert A\Sigma A'\rvert^{1/2},$
and hence $g(\mathbf{u})$ is the density of a $p$-variate Normal distribution with mean vector $A\boldsymbol{\mu}+\mathbf{c}$ and covariance matrix $A\Sigma A'$. QED
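The theorem on slide 42 is easy to check by simulation. The sketch below draws from a $p$-variate normal with made-up $\boldsymbol{\mu}$, $\Sigma$, $A$ and $\mathbf{c}$ (illustrative values only) and compares the sample mean and covariance of $\mathbf{u} = A\mathbf{x} + \mathbf{c}$ with the theoretical $A\boldsymbol{\mu} + \mathbf{c}$ and $A\Sigma A'$.

```python
# Monte Carlo check of slide 42: u = A x + c is normal with mean A mu + c
# and covariance A Sigma A'.  All numbers below are made up for illustration.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.5, 0.2],
                  [0.3, 0.2, 1.0]])
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, -1.0],
              [1.0, 0.0, 3.0]])      # nonsingular 3x3
c = np.array([0.5, 0.0, -1.0])

x = rng.multivariate_normal(mu, Sigma, size=200_000)   # rows are draws of x
u = x @ A.T + c                                        # u = A x + c for each draw

print("sample mean of u:", u.mean(axis=0))
print("A mu + c:        ", A @ mu + c)
print("sample cov of u:\n", np.cov(u, rowvar=False))
print("A Sigma A':\n", A @ Sigma @ A.T)
```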
