
Two Functions of Two Random Variables



  1. Two Functions of Two Random Variables

  2. Two Functions of Two R.V.s • X and Y are two random variables with joint p.d.f f_{XY}(x,y). • g(x,y) and h(x,y) are functions; define the new random variables Z = g(X,Y) and W = h(X,Y). • How to determine the joint p.d.f f_{ZW}(z,w)? • With f_{ZW}(z,w) in hand, the marginal p.d.fs f_Z(z) and f_W(w) can be easily determined.

  3. Two Functions of Two R.V.s • For given z and w, F_{ZW}(z,w) = P(Z ≤ z, W ≤ w) = P(g(X,Y) ≤ z, h(X,Y) ≤ w) = ∫∫_{D_{z,w}} f_{XY}(x,y) dx dy, • where D_{z,w} is the region in the xy plane such that g(x,y) ≤ z and h(x,y) ≤ w.
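
This region picture can be checked numerically. The sketch below is a minimal Monte Carlo illustration, assuming an arbitrary, hypothetical choice g(x,y) = x + y, h(x,y) = x − y with X, Y independent standard Gaussians, so that the exact answer is also available for comparison.

```python
import numpy as np
from math import erf, sqrt

# Illustrative (not from the slides): g(x,y) = x + y, h(x,y) = x - y,
# with X, Y independent standard normal r.vs.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)

z, w = 1.0, 0.5
# F_ZW(z,w) = P(g(X,Y) <= z, h(X,Y) <= w) is the probability mass of the
# region D_{z,w} = {(x,y): g(x,y) <= z, h(x,y) <= w} under f_XY(x,y).
F_mc = np.mean((x + y <= z) & (x - y <= w))

# For this g, h the exact answer is available, because X+Y and X-Y are
# independent N(0, 2) r.vs: F_ZW(z,w) = Phi(z/sqrt(2)) * Phi(w/sqrt(2)).
Phi = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))
print(F_mc, Phi(z / sqrt(2.0)) * Phi(w / sqrt(2.0)))   # should agree closely
```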

  4. Example • X and Y are independent, uniformly distributed r.v.s in (0,1). • Define Z = min(X,Y), W = max(X,Y). • Determine f_{ZW}(z,w). • Solution: Obviously both z and w vary in the interval (0,1). Thus F_{ZW}(z,w) = 0 for z < 0 or w < 0. • Two cases must be considered, w ≥ z and w < z, since they give rise to different regions for {Z ≤ z, W ≤ w}.

  5. Example - continued • For w < z: F_{ZW}(z,w) = P(max(X,Y) ≤ w) = w². • For w ≥ z: F_{ZW}(z,w) = P(max(X,Y) ≤ w) − P(z < X ≤ w, z < Y ≤ w) = w² − (w − z)². • With f_{ZW}(z,w) = ∂²F_{ZW}(z,w)/∂z∂w, • we obtain • f_{ZW}(z,w) = 2 for 0 < z < w < 1, and 0 otherwise.

  6. Example - continued • Also, f_Z(z) = ∫_z^1 2 dw = 2(1 − z), 0 < z < 1, • and f_W(w) = ∫_0^w 2 dz = 2w, 0 < w < 1. • If g(x,y) and h(x,y) are continuous and differentiable functions, • then it is possible to develop a formula to obtain the joint p.d.f f_{ZW}(z,w) directly.
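
If the example is read as Z = min(X,Y) and W = max(X,Y) of two independent uniform(0,1) r.vs (the reading used above), the derived marginals can be sanity-checked with a short simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(size=1_000_000)
y = rng.uniform(size=1_000_000)
z = np.minimum(x, y)          # Z = min(X, Y)
w = np.maximum(x, y)          # W = max(X, Y)

# Compare histogram estimates with f_Z(z) = 2(1 - z) and f_W(w) = 2w on (0,1).
for sample, pdf in [(z, lambda t: 2 * (1 - t)), (w, lambda t: 2 * t)]:
    hist, edges = np.histogram(sample, bins=20, range=(0, 1), density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    print(np.max(np.abs(hist - pdf(mids))))   # should be small (~1e-2)
```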

  7. • Let’s consider the pair of equations g(x,y) = z, h(x,y) = w. • Let us say (x_1, y_1), (x_2, y_2), …, (x_n, y_n) are the solutions to the above equations for the given (z,w), so that g(x_i, y_i) = z and h(x_i, y_i) = w for each i.

  8. Consider the problem of evaluating the probability P(z < Z ≤ z + Δz, w < W ≤ w + Δw). This can be rewritten as: P(z < g(X,Y) ≤ z + Δz, w < h(X,Y) ≤ w + Δw). To translate this probability in terms of f_{XY}(x,y), we need to evaluate the equivalent region for Δz Δw in the xy plane.

  9. • The point A with coordinates (z,w) gets mapped onto the point A′ with coordinates (x_i, y_i). • As z changes to z + Δz to point B in figure (a), • let B′ represent its image in the xy plane. • As w changes to w + Δw to C, • let C′ represent its image in the xy plane.

  10. Finally D goes to D′. • A′B′C′D′ represents the equivalent parallelogram in the xy plane with area Δ_i. • The desired probability can be alternatively expressed as Σ_i f_{XY}(x_i, y_i) Δ_i. • Equating these, we obtain f_{ZW}(z,w) Δz Δw = Σ_i f_{XY}(x_i, y_i) Δ_i, i.e., f_{ZW}(z,w) = Σ_i f_{XY}(x_i, y_i) Δ_i/(Δz Δw). • To simplify this, we need to evaluate the area Δ_i of the parallelograms in terms of Δz Δw.

  11. Let g1(z,w) and h1(z,w) denote the inverse transformations, so that x_i = g1(z,w), y_i = h1(z,w). • As the point (z,w) goes to A′ ≡ (x_i, y_i), • the point (z + Δz, w) goes to B′, • the point (z, w + Δw) goes to C′, • the point (z + Δz, w + Δw) goes to D′. • Hence the x and y coordinates of B′ are given by g1(z + Δz, w) ≈ x_i + (∂g1/∂z)Δz and h1(z + Δz, w) ≈ y_i + (∂h1/∂z)Δz. • Similarly, those of C′ are given by x_i + (∂g1/∂w)Δw and y_i + (∂h1/∂w)Δw.

  12. The area of the parallelogram A′B′C′D′ is given by Δ_i = (A′B′)(A′C′) sin(θ − φ) = (A′B′ cos φ)(A′C′ sin θ) − (A′B′ sin φ)(A′C′ cos θ). • From the figure and these equations, A′B′ cos φ = (∂g1/∂z)Δz, A′B′ sin φ = (∂h1/∂z)Δz, A′C′ cos θ = (∂g1/∂w)Δw, A′C′ sin θ = (∂h1/∂w)Δw, • so that Δ_i = (∂g1/∂z · ∂h1/∂w − ∂h1/∂z · ∂g1/∂w) Δz Δw, • or Δ_i/(Δz Δw) = |∂g1/∂z · ∂h1/∂w − ∂h1/∂z · ∂g1/∂w|.

  13. This is the Jacobian of the inverse transformation: J(z,w) = ∂g1/∂z · ∂h1/∂w − ∂g1/∂w · ∂h1/∂z. • Using these in the expression for f_{ZW}(z,w), we get f_{ZW}(z,w) = Σ_i |J(z,w)| f_{XY}(x_i, y_i) = Σ_i f_{XY}(x_i, y_i)/|J(x_i, y_i)|, (*) • where J(x_i, y_i) represents the Jacobian of the original transformation, J(x,y) = ∂z/∂x · ∂w/∂y − ∂z/∂y · ∂w/∂x, evaluated at the solution pair (x_i, y_i).
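
Formula (*) can be sanity-checked numerically. The sketch below uses a hypothetical one-to-one transformation, Z = X + Y, W = X − Y with X, Y independent standard Gaussians, where the single solution pair and the Jacobian are easy to write down and the exact joint density of (Z, W) is known independently.

```python
import numpy as np

# Hypothetical illustration of (*): Z = X + Y, W = X - Y with X, Y iid N(0,1).
# There is a single solution x1 = (z+w)/2, y1 = (z-w)/2, and the Jacobian of
# the original transformation is dz/dx * dw/dy - dz/dy * dw/dx = -2.
f_xy = lambda x, y: np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

def f_zw(z, w):
    x1, y1 = (z + w) / 2, (z - w) / 2
    return f_xy(x1, y1) / abs(-2.0)          # f_ZW = f_XY(x1, y1) / |J(x1, y1)|

# Z and W are independent N(0,2), so the exact joint density is known:
exact = lambda z, w: np.exp(-(z**2 + w**2) / 4) / (4 * np.pi)
z, w = 0.7, -1.2
print(f_zw(z, w), exact(z, w))               # the two values should match
```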

  14. Example • Example 9.2: Suppose X and Y are zero mean independent Gaussian r.vs with common variance σ². • Define Z = √(X² + Y²), W = tan⁻¹(Y/X), |W| < π/2. • Obtain f_{ZW}(z,w). • Solution: Here f_{XY}(x,y) = (1/(2πσ²)) e^{−(x²+y²)/(2σ²)}. • Since tan w = y/x, • if (x, y) is a solution pair, so is (−x, −y). • Thus y = x tan w.

  15. Example - continued • Substituting this into z, we get z = √(x² + x² tan² w) = |x| sec w, • and x = ±z cos w, y = ±z sin w. • Thus there are two solution sets: x_1 = z cos w, y_1 = z sin w and x_2 = −z cos w, y_2 = −z sin w, • so that f_{XY}(x_1, y_1) = f_{XY}(x_2, y_2) = (1/(2πσ²)) e^{−z²/(2σ²)}.

  16. Example - continued • Also, the Jacobian of the transformation is J(x,y) = ∂z/∂x · ∂w/∂y − ∂z/∂y · ∂w/∂x = 1/√(x² + y²) = 1/z. • Notice that here also |J(x_1, y_1)| = |J(x_2, y_2)|. • Using (*), f_{ZW}(z,w) = (z/(πσ²)) e^{−z²/(2σ²)}, 0 < z < ∞, |w| < π/2. • Thus f_Z(z) = ∫_{−π/2}^{π/2} f_{ZW}(z,w) dw = (z/σ²) e^{−z²/(2σ²)}, z > 0, • which represents a Rayleigh r.v with parameter σ².

  17. Example - continued • Also, f_W(w) = ∫_0^∞ f_{ZW}(z,w) dz = 1/π, |w| < π/2, • which represents a uniform r.v in (−π/2, π/2). • Moreover, f_{ZW}(z,w) = f_Z(z) f_W(w). • So Z and W are independent. • To summarize: if X and Y are zero mean independent Gaussian random variables with common variance, then • √(X² + Y²) has a Rayleigh distribution, • and tan⁻¹(Y/X) has a uniform distribution. • These two derived r.vs are statistically independent.

  18. Example - continued • Alternatively, with X and Y as independent zero mean Gaussian r.vs with common variance, X + jY represents a complex Gaussian r.v. But X + jY = Z e^{jW}, • where Z and W are as defined above, except that for this representation to hold good on the entire complex plane we must have −π < W ≤ π. • The magnitude and phase of a complex Gaussian r.v are independent, with Rayleigh and uniform distributions respectively.
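
These conclusions are easy to check by simulation. The sketch below is a minimal numpy check (the phase is computed with arctan2 so that it covers (−π, π], as in the complex form above): it looks at the Rayleigh mean, the phase quantiles, and one necessary condition for independence.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5
x = sigma * rng.standard_normal(1_000_000)
y = sigma * rng.standard_normal(1_000_000)

z = np.hypot(x, y)        # magnitude: should be Rayleigh with parameter sigma^2
w = np.arctan2(y, x)      # phase: should be uniform in (-pi, pi]

# Rayleigh check via the mean: E[Z] = sigma * sqrt(pi/2).
print(z.mean(), sigma * np.sqrt(np.pi / 2))

# Uniform-phase check via a few quantiles.
print(np.quantile(w, [0.25, 0.5, 0.75]), [-np.pi / 2, 0.0, np.pi / 2])

# Independence check (necessary condition): P(Z <= a, W <= b) ~ P(Z <= a) P(W <= b).
a, b = sigma, 1.0
print(np.mean((z <= a) & (w <= b)), np.mean(z <= a) * np.mean(w <= b))
```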

  19. Example • Let X and Y be independent exponential random variables with common parameter λ. • Define U = X + Y, V = X − Y. • Find the joint and marginal p.d.f of U and V. • Solution: It is given that f_{XY}(x,y) = λ² e^{−λ(x+y)}, x > 0, y > 0. • Now since u = x + y, v = x − y, we have |v| ≤ u always, and there is only one solution, given by x = (u + v)/2, y = (u − v)/2.

  20. Example - continued • Moreover, the Jacobian of the transformation is given by J(x,y) = ∂u/∂x · ∂v/∂y − ∂u/∂y · ∂v/∂x = (1)(−1) − (1)(1) = −2, so |J| = 2, • and hence f_{UV}(u,v) = (λ²/2) e^{−λu}, 0 ≤ |v| ≤ u < ∞, represents the joint p.d.f of U and V. This gives f_U(u) = ∫_{−u}^{u} (λ²/2) e^{−λu} dv = λ² u e^{−λu}, 0 < u < ∞, • and f_V(v) = ∫_{|v|}^{∞} (λ²/2) e^{−λu} du = (λ/2) e^{−λ|v|}, −∞ < v < ∞. • Notice that in this case the r.vs U and V are not independent, since f_{UV}(u,v) ≠ f_U(u) f_V(v).
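
A short simulation consistent with the densities just derived: U should behave like the gamma density λ²u e^{−λu}, V like the Laplace density (λ/2)e^{−λ|v|}, and U, V should fail an independence check. This is a minimal sketch; λ = 2 and the test points are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 2.0
x = rng.exponential(scale=1 / lam, size=1_000_000)
y = rng.exponential(scale=1 / lam, size=1_000_000)
u, v = x + y, x - y

# Moment checks against the derived marginals:
#   f_U(u) = lam^2 * u * exp(-lam*u)  -> E[U] = 2/lam, Var[U] = 2/lam^2
#   f_V(v) = (lam/2) * exp(-lam*|v|)  -> E[V] = 0,     Var[V] = 2/lam^2
print(u.mean(), 2 / lam, u.var(), 2 / lam**2)
print(v.mean(), 0.0, v.var(), 2 / lam**2)

# U and V are NOT independent: |V| <= U always, so
# P(U <= a, |V| > a) = 0 while P(U <= a) * P(|V| > a) > 0.
a = 1 / lam
print(np.mean((u <= a) & (np.abs(v) > a)), np.mean(u <= a) * np.mean(np.abs(v) > a))
```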

  21. As we will show, the general transformation formula in (*) making use of two functions can be made useful even when only one function is specified.

  22. Auxiliary Variables • Suppose Z = g(X,Y), where • X and Y are two random variables. • To determine f_Z(z) by making use of the above formulation in (*), we can define an auxiliary variable W = X or W = Y, • and the p.d.f of Z can be obtained from f_{ZW}(z,w) by proper integration.

  23. Example • Z = X + Y. • Let W = Y, so that the transformation is one-to-one and the solution is given by x = z − w, y = w. • The Jacobian of the transformation is given by J(x,y) = ∂z/∂x · ∂w/∂y − ∂z/∂y · ∂w/∂x = (1)(1) − (1)(0) = 1, • and hence f_{ZW}(z,w) = f_{XY}(z − w, w), • or f_Z(z) = ∫_{−∞}^{∞} f_{XY}(z − w, w) dw. • This reduces to the convolution of f_X and f_Y, f_Z(z) = ∫ f_X(z − w) f_Y(w) dw, if X and Y are independent random variables.
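
The auxiliary-variable result can be exercised numerically: the sketch below evaluates f_Z(z) = ∫ f_X(z − w) f_Y(w) dw on a grid for a hypothetical choice of densities (two unit exponentials) and compares it with the known closed form of their convolution.

```python
import numpy as np

# Numerical version of f_Z(z) = ∫ f_X(z - w) f_Y(w) dw for an illustrative
# (hypothetical) choice: X, Y independent exponential(1), for which the
# convolution has the closed form f_Z(z) = z * exp(-z), z > 0.
f_x = lambda t: np.where(t >= 0, np.exp(-t), 0.0)
f_y = f_x

w = np.linspace(0.0, 40.0, 160_001)      # integration grid over the auxiliary variable w
dw = w[1] - w[0]

def f_z(z):
    # simple Riemann-sum approximation of the convolution integral
    return float(np.sum(f_x(z - w) * f_y(w)) * dw)

for z in (0.5, 1.0, 3.0):
    print(f_z(z), z * np.exp(-z))        # numerical value vs. closed form
```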

  24. Example • Let X ~ U(0,1) and Y ~ U(0,1) be independent. • Define Z = √(−2 ln X) cos(2πY). • Find the density function of Z. • Solution: Making use of the auxiliary variable W = Y, the solution is x = e^{−z²/(2 cos²(2πw))}, y = w.

  25. Example - continued • Using these in (*), we obtain f_{ZW}(z,w), • and f_Z(z) = ∫_0^1 f_{ZW}(z,w) dw. • Let u = z tan(2πw), so that du = 2πz sec²(2πw) dw. • Notice that as w varies from 0 to 1, u varies from −∞ to +∞.

  26. Example - continued • Using this in the previous formula, we get f_Z(z) = (1/√(2π)) e^{−z²/2}, −∞ < z < ∞. • As you can see, Z ~ N(0,1): a Gaussian r.v has been obtained from two independent uniform r.vs. • A practical procedure to generate Gaussian random variables is from two independent uniformly distributed random sequences, based on Z_1 = √(−2 ln U_1) cos(2πU_2) and Z_2 = √(−2 ln U_1) sin(2πU_2).
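
The sketch below implements this generation procedure (commonly known as the Box-Muller transformation); the function name and the use of numpy are illustrative choices, not from the slides.

```python
import numpy as np

def box_muller(n, rng=None):
    """Generate n N(0,1) samples from pairs of independent U(0,1) r.vs using
    Z1 = sqrt(-2 ln U1) cos(2 pi U2), Z2 = sqrt(-2 ln U1) sin(2 pi U2)."""
    rng = rng or np.random.default_rng()
    m = (n + 1) // 2
    u1 = rng.uniform(size=m)
    u2 = rng.uniform(size=m)
    r = np.sqrt(-2.0 * np.log1p(-u1))    # 1 - U1 is also U(0,1); avoids log(0)
    z = np.concatenate([r * np.cos(2 * np.pi * u2), r * np.sin(2 * np.pi * u2)])
    return z[:n]

samples = box_muller(1_000_000, np.random.default_rng(5))
print(samples.mean(), samples.std())     # should be close to 0 and 1
```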

  27. Example • Let X and Y be independent identically distributed Geometric random variables with P(X = k) = P(Y = k) = p q^k, k = 0, 1, 2, …, where q = 1 − p. • (a) Show that min(X, Y) and X − Y are independent random variables. • (b) Show that min(X, Y) and max(X, Y) − min(X, Y) are also independent. • Solution: (a) Let Z = min(X, Y) and W = X − Y. • Note that Z takes only nonnegative values while W takes positive, zero and negative values.

  28. Example - continued • We have P(Z = m, W = n) = P{min(X, Y) = m, X − Y = n}. • But for n > 0 this event is the same as {Y = m, X = m + n}, for n < 0 it is the same as {X = m, Y = m − n}, and for n = 0 it is {X = Y = m}. • Thus P(Z = m, W = n) = p² q^{2m + |n|}, m = 0, 1, 2, …, n = 0, ±1, ±2, …

  29. Example - continued • This represents the joint probability mass function of the random variables Z and W. • Also, P(Z = m) = Σ_n P(Z = m, W = n) = p² q^{2m} (1 + 2 Σ_{n=1}^∞ q^n) = p² q^{2m} (1 + q)/(1 − q) = (1 − q²) q^{2m}, • and P(W = n) = Σ_m P(Z = m, W = n) = p² q^{|n|} Σ_{m=0}^∞ q^{2m} = ((1 − q)/(1 + q)) q^{|n|}, n = 0, ±1, ±2, …

  30. Example - continued • Thus Z represents a Geometric random variable, since P(Z = m) = (1 − q²)(q²)^m, m = 0, 1, 2, … • Note that P(Z = m, W = n) = p² q^{2m + |n|} = P(Z = m) P(W = n), establishing the independence of the random variables Z and W. • The independence of X − Y and min(X, Y) when X and Y are independent Geometric random variables is an interesting observation.

  31. Example - continued • (b) Let Z = min(X, Y), R = max(X, Y) − min(X, Y). • In this case both Z and R take nonnegative integer values. • Proceeding as in (a), for k > 0 the event {Z = m, R = k} is the union of the mutually exclusive events {X = m, Y = m + k} and {Y = m, X = m + k}, while for k = 0 it reduces to {X = Y = m}; • we get P(Z = m, R = k) = 2p² q^{2m + k} for k = 1, 2, …, and P(Z = m, R = 0) = p² q^{2m}.

  32. Example - continued • This equation is the joint probability mass function of Z and R. • Also we can obtain P(Z = m) = Σ_k P(Z = m, R = k) = (1 − q²) q^{2m}, as before, • and P(R = k) = Σ_m P(Z = m, R = k) = (1 − q)/(1 + q) for k = 0 and 2((1 − q)/(1 + q)) q^k for k = 1, 2, … • From these three expressions, we get P(Z = m, R = k) = P(Z = m) P(R = k). • This proves the independence of the random variables Z and R as well.
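
Both parts can be checked empirically. The sketch below assumes the pmf parametrization used in the reconstruction above, P(X = k) = p q^k for k ≥ 0, and compares joint relative frequencies with products of marginals at a couple of arbitrary points (p = 0.3 is also an arbitrary choice).

```python
import numpy as np

rng = np.random.default_rng(6)
p = 0.3
n = 1_000_000
x = rng.geometric(p, size=n) - 1     # numpy's geometric starts at 1; shift to start at 0
y = rng.geometric(p, size=n) - 1

z = np.minimum(x, y)                 # Z = min(X, Y)
w = x - y                            # W = X - Y
r = np.maximum(x, y) - z             # R = max(X, Y) - min(X, Y)

def compare(a, b, a_val, b_val):
    # joint relative frequency vs. product of marginal relative frequencies
    joint = np.mean((a == a_val) & (b == b_val))
    prod = np.mean(a == a_val) * np.mean(b == b_val)
    print(f"joint = {joint:.5f}   product = {prod:.5f}")

compare(z, w, 1, -2)   # part (a): Z and W should look independent
compare(z, r, 1, 2)    # part (b): Z and R should look independent
```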
