
Multiple Random Variables: CDFs, PDFs, Marginals, Independence, and Functions

This article covers the concepts of multiple random variables, including cumulative distribution functions (CDFs), probability density functions (PDFs), marginals, independence, and functions of random variables. It also explores multidimensional expectation, correlation and covariance, and multivariate Gaussian random variables.


Presentation Transcript


  1. EE 5345 Multiple Random Variables • CDFs and PDFs, Marginals, Independence • Functions of Several RVs • Multidimensional Expectation: Correlation & Covariance • Multivariate Gaussian RVs.

  2. Multiple Random Variables • Cumulative Distribution Function
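
The slide's equation did not survive the transcript; the joint CDF it defines is the standard one:

```latex
F_{X,Y}(x, y) = P[X \le x,\ Y \le y]
```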

  3. Multiple Random Variables (cont) • Probability Density Function • CDF Marginals
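
In standard form, the joint pdf is the mixed partial derivative of the joint CDF, and the CDF marginals come from letting the unwanted argument go to infinity:

```latex
f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y},
\qquad
F_X(x) = F_{X,Y}(x, \infty), \quad F_Y(y) = F_{X,Y}(\infty, y)
```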

  4. Multiple Random Variables (cont) • PDF Marginals: Integrate out what you don't want.
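
That is, presumably the slide showed the standard marginalization integrals:

```latex
f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy,
\qquad
f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx
```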

  5. Multiple Random Variables (cont) • Conditional PDFs
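
The standard definition of the conditional pdf, for fY(y) > 0:

```latex
f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}
```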

  6. Multiple Random Variables (cont) • Independence • The joint is the product of the marginals.
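
In symbols:

```latex
f_{X,Y}(x, y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y
```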

  7. Multiple Random Variables (cont) • Expectation • Note: if the Xk's are independent, the expectation of a product factors into a product of expectations (proof on slide; see below).
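
The standard expectation formula, and the factorization the note refers to:

```latex
E[g(X, Y)] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} g(x, y)\, f_{X,Y}(x, y)\, dx\, dy,
\qquad
E\!\left[\prod_{k=1}^{n} g_k(X_k)\right] = \prod_{k=1}^{n} E[g_k(X_k)] \ \ \text{(independence)}
```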

  8. Multiple Random Variables (cont) • If Xk’s are independent, the joint characteristic function is the product of the marginal characteristic functions
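
With the usual EE convention for j, the joint characteristic function and its factorization under independence are:

```latex
\Phi_{X_1,\dots,X_n}(\omega_1, \dots, \omega_n)
  = E\!\left[e^{\,j(\omega_1 X_1 + \cdots + \omega_n X_n)}\right]
  = \prod_{k=1}^{n} \Phi_{X_k}(\omega_k)
```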

  9. Random Variable Sum (cont) • If X and Y are independent and Z = X + Y, the characteristic function of Z factors. • Thus, from the convolution theorem of Fourier analysis, the density of Z is a convolution (see below).
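
The two equations in question are presumably the standard ones:

```latex
\Phi_Z(\omega) = E\!\left[e^{\,j\omega(X+Y)}\right] = \Phi_X(\omega)\, \Phi_Y(\omega)
\quad \Longrightarrow \quad
f_Z(z) = (f_X * f_Y)(z)
```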

  10. Random Variable Sum (cont) • If {Xk | 1 ≤ k ≤ n} are i.i.d. (independent and identically distributed) and S = X1 + … + Xn, then…

  11. Random Variable Sum (cont) If {Xk | 1 ≤ k ≤ n} are i.i.d. and S = X1 + … + Xn, then (by the characteristic-function identity below): • S is Gaussian if the Xk's are Gaussian. • S is Poisson if the Xk's are Poisson. • S is Binomial if the Xk's are Binomial. • S is Gamma if the Xk's are Gamma. • S is Cauchy if the Xk's are Cauchy. • S is Negative Binomial if the Xk's are Negative Binomial.
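
The underlying fact: the characteristic function of an i.i.d. sum is the n-th power of the common characteristic function, and each family listed above is closed under this operation:

```latex
S = \sum_{k=1}^{n} X_k
\quad \Longrightarrow \quad
\Phi_S(\omega) = \left[\Phi_X(\omega)\right]^{n}
```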

  12. Functions of Several Random Variables Types of Transformations • A Single Function of n RVs • Functions of n RVs

  13. Leibniz's Rule "It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could be safely relegated to anyone else if machines were used." Gottfried Wilhelm Leibniz (1646-1716)
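
For reference, Leibniz's rule for differentiating an integral with variable limits:

```latex
\frac{d}{dz} \int_{a(z)}^{b(z)} h(x, z)\, dx
  = h(b(z), z)\, b'(z) - h(a(z), z)\, a'(z)
  + \int_{a(z)}^{b(z)} \frac{\partial h(x, z)}{\partial z}\, dx
```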

  14. One Function of Several Random Variables [figure: the region Rz in the (x1, x2) plane] • CDF of Z (see below) • Challenge: Find the region Rz
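
In standard form:

```latex
F_Z(z) = P[g(X_1, X_2) \le z]
       = \iint_{R_z} f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2,
\qquad
R_z = \{(x_1, x_2) : g(x_1, x_2) \le z\}
```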

  15. One Function of Several Random Variables (cont) [figure: the region Rz in the (x1, x2) plane]

  16. Sum of Two Random Variables (cont) [figure: the region Rz = {(x, y) : x + y ≤ z} in the (x, y) plane]

  17. Sum of Two Random Variables (cont) [figure: the region Rz = {(x, y) : x + y ≤ z} in the (x, y) plane]

  18. Sum of Two Random Variables (cont)

  19. Sum of Two Random Variables (cont) If X and Y are independent, the density of Z = X + Y is the convolution of the marginal densities (see below). This is the same result we saw with the characteristic function.
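
The standard derivation: integrate the joint pdf over the half-plane x + y ≤ z, differentiate with Leibniz's rule, and factor when X and Y are independent:

```latex
f_Z(z) = \int_{-\infty}^{\infty} f_{X,Y}(x,\, z - x)\, dx
\ \ \xrightarrow{\ \text{independence}\ }\ \
f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx = (f_X * f_Y)(z)
```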

  20. Product of Two Random Variables [figure: the region Rz below the hyperbola y = z/x] Assume X > 0 and Y > 0. Then…

  21. Product of Two Random Variables, X, Y > 0 [figure: the region Rz below the hyperbola y = z/x]

  22. Product of Two Random Variables, X, Y > 0 • Use Leibniz's rule (see below). • What happens when we do not restrict X and Y to be positive?
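
Differentiating F_Z(z) = P[XY ≤ z] with Leibniz's rule gives, for X, Y > 0:

```latex
F_Z(z) = \int_{0}^{\infty} \int_{0}^{z/x} f_{X,Y}(x, y)\, dy\, dx
\quad \Longrightarrow \quad
f_Z(z) = \int_{0}^{\infty} f_{X,Y}\!\left(x, \tfrac{z}{x}\right) \frac{1}{x}\, dx
```

Without the positivity restriction, the same argument yields the integral over all x ≠ 0 with 1/x replaced by 1/|x|.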

  23. Product of Two Random Variables: Example X and Y i.i.d. and uniform on (0,1)

  24. Product of Two Random Variables: Example (cont) [figure: fZ(z) on 0 < z < 1] X and Y i.i.d. and uniform on (0,1)
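
Carrying out the integral for this example (the joint pdf is 1 on the unit square, so the integrand is nonzero only for z < x < 1):

```latex
f_Z(z) = \int_{z}^{1} \frac{1}{x}\, dx = -\ln z = \ln \frac{1}{z}, \qquad 0 < z < 1
```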

  25. Quotient of Two Random Variables [figure: fX(x) and the stretched fY(y) for Y = aX with a = 2] Scaling Background: If Y = aX…
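
The scaling result the slide relies on (with a = 2, the figure shows fY at twice the width and half the height of fX):

```latex
Y = aX,\ a \ne 0
\quad \Longrightarrow \quad
f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y}{a}\right)
```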

  26. Quotient of Two Random Variables (cont) Given Y = y, Z = X/Y is a simple scaling problem with a = 1/y. Thus…

  27. Quotient of Two Random Variables (cont) • Joint pdf…
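
Conditioning on Y = y, applying the scaling result with a = 1/y, and averaging over the joint pdf gives the standard quotient formula:

```latex
Z = \frac{X}{Y}
\quad \Longrightarrow \quad
f_Z(z) = \int_{-\infty}^{\infty} |y|\, f_{X,Y}(zy,\, y)\, dy
```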

  28. Quotient of Two Random Variables: Example • X and Y i.i.d. exponential RVs

  29. Quotient of Two Random Variables: Example (cont) • Integration
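
Carrying out the integration for X, Y i.i.d. exponential with rate λ (note the λ cancels, so the answer does not depend on the rate):

```latex
f_Z(z) = \int_{0}^{\infty} y\, \lambda e^{-\lambda z y}\, \lambda e^{-\lambda y}\, dy
       = \frac{1}{(1 + z)^2}, \qquad z > 0
```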

  30. Expectation • There are two ways to find E[Z] for Z = g(X, Y) (see the formulas below): 1. Smart way: take the expectation of g(X, Y) against the joint pdf directly. 2. Dumb way (unless you already know the distribution of Z): set Z = g(X, Y), find fZ(z), then compute E[Z] from fZ.
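
In formulas, the two routes are:

```latex
\text{1 (smart):}\ \ E[Z] = E[g(X, Y)] = \iint g(x, y)\, f_{X,Y}(x, y)\, dx\, dy
\qquad
\text{2 (dumb):}\ \ \text{find } f_Z(z),\ \text{then } E[Z] = \int z\, f_Z(z)\, dz
```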

  31. Expectation (example) • X and Y uniform on (0,1) and i.i.d. • Find E[Z] when Z = cos(2π(X + Y))
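
As a quick numerical check, here is a minimal NumPy sketch (assuming the intended function is cos(2π(X + Y)); the π appears to have been lost in transcription). The analytic answer is E[Z] = 0, since cos(2π(x + y)) integrates to zero over a full period in x for every fixed y:

```python
import numpy as np

# Monte Carlo estimate of E[cos(2*pi*(X + Y))] for X, Y i.i.d. Uniform(0, 1).
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, 1.0, size=n)
z = np.cos(2.0 * np.pi * (x + y))

# The sample mean should be near the analytic value 0,
# within Monte Carlo error on the order of 1/sqrt(n).
print(z.mean())
```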

  32. Expectation (discrete) • If X and Y are discrete RVs, we can use the probability mass function
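
The discrete analog of the expectation integral:

```latex
E[g(X, Y)] = \sum_{j} \sum_{k} g(x_j, y_k)\, p_{X,Y}(x_j, y_k)
```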

  33. Expectation (joint moments) • The joint moments of X and Y are given below. • If discrete, the integrals become sums over the pmf.
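
The standard definition:

```latex
E[X^j Y^k] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} x^j y^k\, f_{X,Y}(x, y)\, dx\, dy
```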

  34. Expectation (correlation) • The correlation of X and Y is E[XY]. If the correlation is 0, X and Y are orthogonal. • The covariance of X and Y and the correlation coefficient are defined below.
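
The standard definitions (with −1 ≤ ρ ≤ 1):

```latex
\mathrm{COV}(X, Y) = E[(X - m_X)(Y - m_Y)] = E[XY] - E[X]\,E[Y],
\qquad
\rho_{X,Y} = \frac{\mathrm{COV}(X, Y)}{\sigma_X\, \sigma_Y}
```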

  35. Joint Gaussian Random Variables What does this look like?
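
The pdf in question is the standard bivariate Gaussian:

```latex
f_{X,Y}(x, y) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}
\exp\!\left\{ \frac{-1}{2(1 - \rho^2)} \left[
\left(\frac{x - m_1}{\sigma_1}\right)^{2}
- 2\rho \left(\frac{x - m_1}{\sigma_1}\right)\!\left(\frac{y - m_2}{\sigma_2}\right)
+ \left(\frac{y - m_2}{\sigma_2}\right)^{2} \right] \right\}
```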

  36. Contours [figure: level curves of g(x, y) in the (x, y) plane] • If g(x, y) = a along a contour, then f(g(x, y)) = f(a) along that same contour: if g(x, y) has contours, then f(g(x, y)) has the same contours.

  37. Joint Gaussian Random Variables • Thus fX,Y(x, y) has the same contours as the quadratic form in its exponent. Setting that quadratic form to a constant gives the equation for an ellipse (see below).
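
The elliptical contours:

```latex
\left(\frac{x - m_1}{\sigma_1}\right)^{2}
- 2\rho \left(\frac{x - m_1}{\sigma_1}\right)\!\left(\frac{y - m_2}{\sigma_2}\right)
+ \left(\frac{y - m_2}{\sigma_2}\right)^{2} = c
```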

  38. Joint Gaussian Random Variables [figure: elliptical contours centered at (m1, m2) in the (x, y) plane] • The means (m1 and m2), variances (σ1² and σ2²), and correlation coefficient ρ uniquely define the 2-D Gaussian. • The marginals are 1-D Gaussian RVs. • Do Gaussian marginals imply a joint Gaussian RV? • When is a joint Gaussian RV a line mass?

  39. n Jointly Gaussian RVs • The joint pdf is written in terms of the mean vector m and the covariance matrix K (see below).
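
The standard n-dimensional Gaussian pdf:

```latex
f_{\mathbf{X}}(\mathbf{x}) =
\frac{\exp\!\left\{ -\tfrac{1}{2} (\mathbf{x} - \mathbf{m})^{T} K^{-1} (\mathbf{x} - \mathbf{m}) \right\}}
     {(2\pi)^{n/2}\, |K|^{1/2}},
\qquad
\mathbf{m} = E[\mathbf{X}], \quad
K = E\!\left[ (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^{T} \right]
```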

  40. n Jointly Gaussian RVs • The Characteristic Function (see below)
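
The standard result:

```latex
\Phi_{\mathbf{X}}(\boldsymbol{\omega})
 = E\!\left[ e^{\,j \boldsymbol{\omega}^{T} \mathbf{X}} \right]
 = \exp\!\left\{ j\, \boldsymbol{\omega}^{T} \mathbf{m}
   - \tfrac{1}{2}\, \boldsymbol{\omega}^{T} K\, \boldsymbol{\omega} \right\}
```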
