C9: Joint Distributions and Independence


Presentation Transcript


  1. MATH 3033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Slides by Michael Maurizi; format by Tim Birbeck. Instructor: Longin Jan Latecki. C9: Joint Distributions and Independence

  2. 9.1 – Joint Distributions of Discrete Random Variables • Joint distribution: the combined distribution of two or more random variables defined on the same sample space Ω. • The joint distribution of two discrete random variables X and Y is obtained from the probabilities of all possible values of the pair (X, Y). • Joint probability mass function p of two discrete random variables X and Y: p(a, b) = P(X = a, Y = b). • Joint distribution function F of two random variables X and Y: F(a, b) = P(X ≤ a, Y ≤ b). In a pmf table, F(a, b) can be read off as the sum of all entries in the box between the table's upper-left corner and the entry for (a, b).
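The relation F(a, b) = P(X ≤ a, Y ≤ b) as a sum of pmf entries can be sketched in a few lines of Python. This is a minimal illustration, not from the slides: it assumes the simple joint pmf of two independent fair dice, where every pair has probability 1/36.

```python
from itertools import product

# Hypothetical example: joint pmf of two fair dice,
# X = first die, Y = second die, each pair has probability 1/36.
p = {(a, b): 1/36 for a, b in product(range(1, 7), repeat=2)}

def F(a, b):
    """Joint distribution function F(a, b) = P(X <= a, Y <= b):
    the sum of all pmf entries in the 'box' up to (a, b)."""
    return sum(prob for (x, y), prob in p.items() if x <= a and y <= b)

# F(2, 3) sums the 2 x 3 block of entries: 6/36 = 1/6
print(F(2, 3))
```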

  3. 9.1 – Joint Distributions of Discrete Random Variables • Marginal distribution: obtained by adding up the rows or columns of a joint probability mass function table; the results are literally written in the margins. Example: the joint distribution of S and M, where S = the sum of two dice and M = the maximum of two dice. • Marginal pmf of X: pX(a) = Σb p(a, b). • Marginal pmf of Y: pY(b) = Σa p(a, b).
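The slides' S-and-M example can be built and marginalized programmatically. A small sketch (variable names are mine) that constructs the joint pmf by enumerating the 36 equally likely dice outcomes, then sums over the other variable to recover each margin:

```python
from itertools import product
from collections import defaultdict

# Joint pmf of S = sum and M = max of two fair dice (the slides' example).
p = defaultdict(float)
for d1, d2 in product(range(1, 7), repeat=2):
    p[(d1 + d2, max(d1, d2))] += 1/36

# Marginals: sum out the other variable, as in the table margins.
p_S = defaultdict(float)
p_M = defaultdict(float)
for (s, m), prob in p.items():
    p_S[s] += prob
    p_M[m] += prob

print(p_S[7])  # P(S = 7) = 6/36
print(p_M[6])  # P(M = 6) = 11/36
```

Each marginal sums to 1, as a quick sanity check on the table.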

  4. 9.2 – Joint Distributions of Continuous Random Variables • Joint continuous distribution: as with an ordinary continuous random variable, probabilities are assigned to ranges of values rather than to single points. X and Y have a joint continuous distribution if there exists a function f satisfying: f(x, y) ≥ 0 for all (x, y), the integral of f over the whole plane equals 1, and P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx.

  5. 9.2 – Joint Distributions of Continuous Random Variables • Joint distribution function: F(a, b) = P(X ≤ a, Y ≤ b) = ∫_{-∞}^a ∫_{-∞}^b f(x, y) dy dx. F(a, b) can be constructed given f(x, y), and vice versa: f(x, y) = ∂²F(x, y)/∂x∂y. • Marginal probability density function: f_X(x) = ∫_{-∞}^{∞} f(x, y) dy. You integrate out the unwanted random variable to get the marginal density.
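Integrating out the unwanted variable can be checked numerically. A sketch under an assumed density not taken from the slides: f(x, y) = x + y on the unit square, which is a valid joint pdf (nonnegative, integrates to 1), with marginal f_X(x) = x + 1/2.

```python
# Hypothetical joint density f(x, y) = x + y on [0,1] x [0,1], zero elsewhere.
def f(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_fX(x, n=10_000):
    """f_X(x) = integral of f(x, y) dy over all y,
    approximated with the midpoint rule on [0, 1]."""
    h = 1 / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

# Analytically f_X(x) = x + 1/2, so f_X(0.3) = 0.8
print(marginal_fX(0.3))
```

The midpoint rule is exact for densities linear in y, so the numerical and analytic marginals agree here up to floating-point error.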

  6. 9.3 – More than Two Random Variables • For n random variables X1, X2, …, Xn defined on the same sample space, the joint distribution function is F(a1, …, an) = P(X1 ≤ a1, …, Xn ≤ an), and for discrete random variables the joint probability mass function is p(a1, …, an) = P(X1 = a1, …, Xn = an).

  7. 9.4 – Independent Random Variables • Tests for independence: two random variables X and Y are independent if and only if every event involving X is independent of every event involving Y. Equivalently, F(a, b) = F_X(a) F_Y(b) for all a and b; for discrete random variables, p(a, b) = p_X(a) p_Y(b) for all a and b; for continuous random variables, f(x, y) = f_X(x) f_Y(y). The same criteria extend to joint distributions of more than two random variables.
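The discrete criterion p(a, b) = p_X(a) p_Y(b) can be checked mechanically over a finite pmf table. A minimal sketch (the helper name is mine): two fair dice pass the test, while the sum S and maximum M of the same dice fail it, since for instance p(S=2, M=6) = 0 but p_S(2) p_M(6) > 0.

```python
from itertools import product
from collections import defaultdict

def is_independent(p, tol=1e-12):
    """Check p(a, b) == pX(a) * pY(b) for every pair (a, b)."""
    pX, pY = defaultdict(float), defaultdict(float)
    for (a, b), prob in p.items():
        pX[a] += prob
        pY[b] += prob
    return all(abs(p.get((a, b), 0.0) - pX[a] * pY[b]) < tol
               for a, b in product(list(pX), list(pY)))

# Two fair dice: independent by construction.
dice = {(a, b): 1/36 for a, b in product(range(1, 7), repeat=2)}
print(is_independent(dice))      # True

# S = sum, M = max of the same dice: dependent.
sm = defaultdict(float)
for d1, d2 in product(range(1, 7), repeat=2):
    sm[(d1 + d2, max(d1, d2))] += 1/36
print(is_independent(dict(sm)))  # False
```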

  8. 9.5 – Propagation of Independence • Independence after a change of variable: if X1, …, Xn are independent and functions h1, …, hn are applied to them separately, the resulting random variables Y1 = h1(X1), …, Yn = hn(Xn) are also independent.
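This propagation can be verified exactly on a finite example. A sketch with assumed choices (two independent fair dice, g(x) = x² and h(y) = y mod 2, names mine): the joint pmf of (g(X), h(Y)) is built by enumeration and still factors into its marginals.

```python
from itertools import product
from collections import defaultdict

# X, Y: two independent fair dice; apply g(x) = x**2 and h(y) = y % 2.
joint = defaultdict(float)
for x, y in product(range(1, 7), repeat=2):
    joint[(x * x, y % 2)] += 1/36

# Marginals of the transformed pair (g(X), h(Y)).
pU, pV = defaultdict(float), defaultdict(float)
for (u, v), prob in joint.items():
    pU[u] += prob
    pV[v] += prob

# The transformed variables are still independent: joint == product of marginals.
ok = all(abs(joint[(u, v)] - pU[u] * pV[v]) < 1e-12
         for u, v in product(list(pU), list(pV)))
print(ok)  # True
```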
