Lecture 11



Joint Probability, Distributions and Random Variables



  • Jointly Distributed RVs

    In many situations, more than one random variable will be of interest to the experimenter.

  • For Discrete Variables:

    • The joint pmf of two discrete random variables X1 and X2 is p(x1, x2) = P(X1 = x1, X2 = x2).

  • The marginal pmfs of X1 and X2, p1(x1) and p2(x2), are obtained by summing out the other variable:

    p1(x1) = Σ over x2 of p(x1, x2), and p2(x2) = Σ over x1 of p(x1, x2).



  • Example

    Consider two production lines that manufacture a certain item. The production rates for both lines vary randomly from day to day. Line I has a capacity of 4 units per day, while line II has a capacity of 3 units per day. Further, both lines produce at least one unit on any given day. Let X1 = number of units produced by line I per day, and X2 = number of units produced by line II per day. The joint probability (pr) distribution (JPD) of X1 and X2 is tabulated as follows (marginals in the last row and column):

             x2 = 1   x2 = 2   x2 = 3   p1(x1)
    x1 = 1    0.01     0.05     0.04     0.10
    x1 = 2    0.05     0.10     0.10     0.25
    x1 = 3    0.10     0.15     0.10     0.35
    x1 = 4    0.04     0.15     0.11     0.30
    p2(x2)    0.20     0.45     0.35     1.00

The above table implies that the joint pr P(X1 = 2, X2 = 3) = p(2, 3) = 0.10, and p(4, 2) = 0.15, etc. Further, p1(x1) and p2(x2) are referred to as the marginal pr distributions (mpds) of X1 and X2, respectively.
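The marginal pmfs can be cross-checked in a few lines of Python (a sketch; the dict below encodes the joint distribution implied by the slide's numbers):

```python
# Joint pmf of (X1, X2), encoded as a dict mapping (x1, x2) -> probability.
p = {
    (1, 1): 0.01, (1, 2): 0.05, (1, 3): 0.04,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.10,
    (3, 1): 0.10, (3, 2): 0.15, (3, 3): 0.10,
    (4, 1): 0.04, (4, 2): 0.15, (4, 3): 0.11,
}

def marginal(p, axis):
    """Marginal pmf: sum the joint pmf over the other variable
    (axis=0 gives p1(x1), axis=1 gives p2(x2))."""
    m = {}
    for pair, prob in p.items():
        m[pair[axis]] = m.get(pair[axis], 0.0) + prob
    return m

p1 = marginal(p, 0)   # p1(1)=0.10, p1(2)=0.25, p1(3)=0.35, p1(4)=0.30
p2 = marginal(p, 1)   # p2(1)=0.20, p2(2)=0.45, p2(3)=0.35
```

Summing every cell confirms the table is a valid pmf (the probabilities add to 1).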



Further,

μ1 = E(X1) = (1)(0.10) + (2)(0.25) + (3)(0.35) + (4)(0.30) = 0.10 + 0.50 + 1.05 + 1.20 = 2.85 units/day, and

μ2 = E(X2) = (1)(0.20) + (2)(0.45) + (3)(0.35) = 0.20 + 0.90 + 1.05 = 2.15 units/day.

Similarly,

E(X1²) = 9.05, so σ1² = E(X1²) − μ1² = 9.05 − (2.85)² = 0.9275, and σ1 = 0.9631.

E(X2²) = 5.15, so σ2² = E(X2²) − μ2² = 5.15 − (2.15)² = 0.5275, and σ2 = 0.7263.

The covariance between 2 random variables (rvs) is defined as:

σ12 = COV(X1, X2) = E[(X1 − μ1)(X2 − μ2)] = E(X1 X2) − μ1 μ2.

For the above example,

E(X1 X2) = (1)(0.01) + (2)(0.05) + (3)(0.04) + (2)(0.05) + (4)(0.10) + (6)(0.10) + (3)(0.10) + (6)(0.15) + (9)(0.10) + (4)(0.04) + (8)(0.15) + (12)(0.11) = 6.11

σ12 = 6.11 − (2.85)·(2.15) = −0.0175

Hence the covariance matrix is:

Σ = | σ1²   σ12 |  =  |  0.9275   −0.0175 |
    | σ12   σ2² |     | −0.0175    0.5275 |

Note that the covariance matrix Σ is always symmetrical because σij = σji for all i ≠ j. Further, covariance must be taken only two rvs at a time (not 3).
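The moments above can be recomputed directly from the joint pmf (a sketch; the table is re-declared so the snippet is self-contained):

```python
# Means, variances, and covariance of (X1, X2) from the joint pmf.
# Expected values: mu1 = 2.85, mu2 = 2.15, var1 = 0.9275, var2 = 0.5275,
# cov12 = -0.0175, matching the slide's arithmetic.
p = {
    (1, 1): 0.01, (1, 2): 0.05, (1, 3): 0.04,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.10,
    (3, 1): 0.10, (3, 2): 0.15, (3, 3): 0.10,
    (4, 1): 0.04, (4, 2): 0.15, (4, 3): 0.11,
}

mu1 = sum(x1 * pr for (x1, x2), pr in p.items())                 # E(X1)
mu2 = sum(x2 * pr for (x1, x2), pr in p.items())                 # E(X2)
var1 = sum(x1 ** 2 * pr for (x1, x2), pr in p.items()) - mu1 ** 2
var2 = sum(x2 ** 2 * pr for (x1, x2), pr in p.items()) - mu2 ** 2
cov12 = sum(x1 * x2 * pr for (x1, x2), pr in p.items()) - mu1 * mu2

cov_matrix = [[var1, cov12], [cov12, var2]]   # symmetric 2x2 covariance matrix
```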



  • The correlation coefficient between X1 and X2, ρ, is:

    ρ = Corr(X1, X2) = σ12 / (σ1 σ2) = Cov(X1, X2) / (σ1 σ2).

  • It can be shown that −1 ≤ ρ ≤ +1, where ρ = 0 implies no correlation between X1 and X2 (ρ = 0 does not always imply that X1 and X2 are independent).

  • A value of ρ = ±1 implies perfect linear correlation between X1 and X2.

  • A positive 0 < ρ ≤ 1 implies that the relationship between X1 and X2 is increasing, and vice versa when −1 ≤ ρ < 0.

  • For example, there is a positive correlation between X1 = the amount of irrigation and X2 = crop yield, while there is a negative correlation between X1 = width of road and X2 = accident rate.
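For the production-line example, the correlation coefficient follows from the standard deviations and covariance computed earlier (a sketch in Python):

```python
import math

# Correlation coefficient for the production-line example, using the
# variances and covariance derived from the joint table.
sigma1 = math.sqrt(0.9275)       # standard deviation of X1
sigma2 = math.sqrt(0.5275)       # standard deviation of X2
cov12 = -0.0175                  # covariance of X1 and X2

rho = cov12 / (sigma1 * sigma2)  # about -0.025: a very weak negative correlation
```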



  • Conditional Probability Distributions – Discrete rvs

    The conditional probability distribution of X2 given X1 = x1 is:

    p(x2 | x1) = p(x1, x2) / p1(x1), provided p1(x1) > 0.

  • For our example, the conditional pmf of X2 given X1 = 1 is p(x2 | 1) = p(1, x2) / p1(1): p(1 | 1) = 0.01/0.10 = 0.10, p(2 | 1) = 0.05/0.10 = 0.50, and p(3 | 1) = 0.04/0.10 = 0.40.



  • Conditional Expectations

    (Remember how the expected value of any discrete random variable is calculated.)

    E(X2 | X1 = x1) = Σ over x2 of x2 · p(x2 | x1)

  • For our example

  • E(X2|X1=1) = (0.10)·(1) + (0.50)·(2) + (0.40)·(3) = 2.30

  • E(X1|X2=3) = (4/35)·(1) + (10/35)·(2) + (10/35)·(3) + (11/35)·(4) = 2.80
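A conditional expectation like E(X2 | X1 = 1) can be computed mechanically from the joint table (a sketch; the table is re-declared so the snippet runs on its own):

```python
# Conditional expectation E(X2 | X1 = given_x1) from the joint pmf.
p = {
    (1, 1): 0.01, (1, 2): 0.05, (1, 3): 0.04,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.10,
    (3, 1): 0.10, (3, 2): 0.15, (3, 3): 0.10,
    (4, 1): 0.04, (4, 2): 0.15, (4, 3): 0.11,
}

def cond_expectation(p, given_x1):
    """E(X2 | X1 = given_x1) = sum over x2 of x2 * p(x1, x2) / p1(x1)."""
    p1 = sum(pr for (x1, x2), pr in p.items() if x1 == given_x1)
    return sum(x2 * pr / p1 for (x1, x2), pr in p.items() if x1 == given_x1)

e = cond_expectation(p, 1)   # 2.30, matching the slide
```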



  • Continuous Bivariate Random Variables

    Example:

    Suppose X1 represents surface tension and X2 represents the acidity of the same sampling unit of a chemical product. The joint probability density function (pdf) of the random variables X1 and X2 is:

f(x1, x2) = C (6 − x1 − x2),  0 ≤ x1 ≤ 2,  2 ≤ x2 ≤ 4

Compute the value of C over 0 ≤ X1 ≤ 2 and 2 ≤ X2 ≤ 4:

(The volume under f(x1, x2) should be equal to ONE, so ∫0^2 ∫2^4 C(6 − x1 − x2) dx2 dx1 = 8C = 1, giving C = 1/8 = 0.125.)



  • f(x1, x2) = 0.125(6 − x1 − x2) is a jpdf over RX: 0 ≤ X1 ≤ 2, 2 ≤ X2 ≤ 4, because the volume under f(x1, x2) is equal to 1.00.

  • Compute the joint probability that a randomly selected unit has a surface tension less than 1 and an acidity not exceeding 3,

    • i.e. P(0 ≤ X1 ≤ 1 and 2 ≤ X2 ≤ 3) = ?

P(0 ≤ X1 ≤ 1 and 2 ≤ X2 ≤ 3) = ∫0^1 ∫2^3 0.125(6 − x1 − x2) dx2 dx1 = 0.375



  • Compute P(X1+X24) or P(X24X1)

You can use the MATLAB commands below:

syms x1 x2

double(int(int(1/8*(6-x1-x2),x2,2,4-x1),x1,0,2)) = 0.6667

double(int(int(1/8*(6-x1-x2),x1,0,4-x2),x2,2,4)) = 0.6667
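The MATLAB result can also be cross-checked in Python (a rough sketch: the triangular region is handled with an indicator inside a midpoint rule, so the answer is approximate rather than exact):

```python
# Approximate P(X1 + X2 <= 4) by integrating the pdf times an indicator of the
# region over the rectangle 0 <= x1 <= 2, 2 <= x2 <= 4 with a midpoint rule.
def f(x1, x2):
    return 0.125 * (6 - x1 - x2) if x1 + x2 <= 4 else 0.0

n = 400
h1, h2 = 2.0 / n, 2.0 / n
prob = h1 * h2 * sum(
    f((i + 0.5) * h1, 2 + (j + 0.5) * h2)
    for i in range(n) for j in range(n)
)
# prob is close to 2/3 = 0.6667
```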



  • Marginal Probability Density Functions – Continuous rvs

    f1(x1) = ∫ f(x1, x2) dx2 and f2(x2) = ∫ f(x1, x2) dx1, integrating over the range of the other variable. For our example:

    f1(x1) = ∫2^4 0.125(6 − x1 − x2) dx2 = 0.75 − 0.25 x1,  0 ≤ x1 ≤ 2

    f2(x2) = ∫0^2 0.125(6 − x1 − x2) dx1 = 1.25 − 0.25 x2,  2 ≤ x2 ≤ 4
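Integrating f(x1, x2) = 0.125(6 − x1 − x2) over 2 ≤ x2 ≤ 4 gives the marginal f1(x1) = 0.75 − 0.25 x1, which a one-dimensional midpoint rule can confirm (a sketch):

```python
# Recover the marginal pdf f1(x1) by numerically integrating the joint pdf
# over x2, and compare against the closed form 0.75 - 0.25*x1.
def f(x1, x2):
    return 0.125 * (6 - x1 - x2)

def f1(x1, n=1000):
    """Midpoint-rule integral of f(x1, x2) over 2 <= x2 <= 4."""
    h = 2.0 / n
    return h * sum(f(x1, 2 + (j + 0.5) * h) for j in range(n))

# f1(0) is about 0.75 and f1(2) is about 0.25, matching 0.75 - 0.25*x1
```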



  • Conditional probability density functions – Continuous rvs

    The conditional pdf of X2 given X1 = x1 is defined as:

    f(x2 | x1) = f(x1, x2) / f1(x1), provided f1(x1) > 0.

  • Conditional Expectation:

    E(X2 | X1 = x1) = ∫ x2 f(x2 | x1) dx2



  • Independent Random Variables

    Two random variables X1 and X2 are independent if for every pair of x1,x2 values,

    p(x1,x2) = p1(x1)·p2(x2), when X1 and X2 are discrete

    f(x1,x2) = f1(x1)·f2(x2), when X1 and X2 are continuous.

    If the conditions above are not satisfied for all (x1,x2) then the rvs X1 and X2 are said to be dependent.

    The two random variables are independent if the joint pmf or pdf is the product of the two marginal pmf’s or pdf’s.

    If X1 and X2 are independent, then E(X2 | x1) is not a function of x1 over the range of X1, RX1, and vice versa.



  • For our discrete example,

    p(2,3) = 0.10 =? p1(2)·p2(3)

    p1(2)·p2(3) = (0.25)·(0.35) = 0.0875

    0.0875 ≠ 0.10, so X1 and X2 are NOT independent.

  • For our continuous example,

    f1(x1)·f2(x2) = (0.75 − 0.25 x1)(1.25 − 0.25 x2) ≠ 0.125(6 − x1 − x2) = f(x1, x2), so X1 and X2 are dependent here as well.

Note that since X2 is not independent of X1, E(X2 | x1) is a function of x1 over the range space RX1 = [0, 2].
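The discrete independence check can be automated by testing the factorization on every cell of the joint table; a single failing cell rules out independence (a sketch, with the table re-declared):

```python
# Test p(x1, x2) = p1(x1) * p2(x2) for every cell of the joint pmf.
p = {
    (1, 1): 0.01, (1, 2): 0.05, (1, 3): 0.04,
    (2, 1): 0.05, (2, 2): 0.10, (2, 3): 0.10,
    (3, 1): 0.10, (3, 2): 0.15, (3, 3): 0.10,
    (4, 1): 0.04, (4, 2): 0.15, (4, 3): 0.11,
}
p1, p2 = {}, {}
for (x1, x2), pr in p.items():
    p1[x1] = p1.get(x1, 0.0) + pr   # marginal of X1
    p2[x2] = p2.get(x2, 0.0) + pr   # marginal of X2

independent = all(abs(pr - p1[x1] * p2[x2]) < 1e-9 for (x1, x2), pr in p.items())
# independent is False: e.g. p(2, 3) = 0.10 while p1(2)*p2(3) = 0.0875
```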



  • Functions of two random variables

Example 5.12 on page 219

Five friends buy tickets for five adjacent seats; find the expected number of seats separating any particular 2 of the 5 people.

Let X1 = the seat number of the 1st, and X2 = the seat number of the 2nd.

Possible (x1, x2) pairs are: {(1,2), (1,3), (1,4), …, (4,5)} – a total of 20 equally likely ways the two might have gotten their tickets.

Let h(x1, x2) be the number of seats separating the two: h(x1, x2) = |x1 − x2| − 1. Then E[h(X1, X2)] = Σ h(x1, x2)·(1/20) = 1 seat.
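The expectation can be verified by enumerating all ordered seat pairs (a sketch):

```python
from itertools import permutations

# Seat-separation example: two particular people among five get distinct
# random seats 1..5; h(x1, x2) = |x1 - x2| - 1 seats separate them.
pairs = list(permutations(range(1, 6), 2))   # the 20 equally likely ordered pairs
expected_sep = sum(abs(x1 - x2) - 1 for x1, x2 in pairs) / len(pairs)
# expected_sep == 1.0
```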



  • Covariance

    If two rvs are not independent, a measure to assess how strongly they are related is often needed.

    The covariance between two rvs X1 and X2 is:

    Cov(X1, X2) = E[(X1 − μ1)(X2 − μ2)]

A simplified (shortcut) formula is:

    Cov(X1, X2) = E(X1 X2) − μ1 μ2

Note that Cov(X1, X1) = V(X1).



  • Correlation Coefficient

    It can be seen that the covariance is a unit-dependent measure: Cov(100 X1, 100 X2) = (100)(100) Cov(X1, X2), therefore a unitless measure is more practical.

    The correlation coefficient of rvs X1 and X2 is:

    ρ = Corr(X1, X2) = Cov(X1, X2) / (σ1 σ2)

Note that

Corr(a X1 + b, c X2 + d) = Corr(X1, X2) whenever a·c > 0,

and −1 ≤ ρ ≤ +1.

If X1 and X2 are independent, then ρ = 0, but ρ = 0 does not imply independence!!

ρ = 1 or −1 iff X2 = a·X1 + b, for some numbers a and b with a ≠ 0.
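The ρ = ±1 property can be seen on a toy distribution (a sketch; the points below are chosen for illustration and are not from the lecture):

```python
import math

# For a perfectly linear relation X2 = a*X1 + b with a = -3 < 0, the
# correlation of a uniform pmf on four points comes out as -1.
points = [(x, -3 * x + 7) for x in (1, 2, 3, 4)]   # X2 = -3*X1 + 7
n = len(points)
m1 = sum(x1 for x1, _ in points) / n               # mean of X1
m2 = sum(x2 for _, x2 in points) / n               # mean of X2
cov = sum((x1 - m1) * (x2 - m2) for x1, x2 in points) / n
s1 = math.sqrt(sum((x1 - m1) ** 2 for x1, _ in points) / n)
s2 = math.sqrt(sum((x2 - m2) ** 2 for _, x2 in points) / n)
rho = cov / (s1 * s2)   # -1.0 up to rounding, since a = -3 < 0
```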

