Section 9 – Functions and Transformations of Random Variables


## PowerPoint Slideshow about 'Section 9 – Functions and Transformations of Random Variables' - jenibelle



### Section 9 – Functions and Transformations of Random Variables

Distribution of a transformation of a continuous RV: X
• Y = u(X)
• Y is defined as a function “u” of X
• v(u(x)) = x
• Function “v” is defined as the inverse of function “u”
• Obtain v by solving the given y = u(x) for x; the solution is x = v(y)
• For a monotone u, the density of Y is f_Y(y) = f_X(v(y))·|v′(y)|
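The inverse-function method above can be sketched numerically. The example below is a hypothetical choice, not from the slides: X ~ Uniform(0, 1) with u(x) = x², which is monotone on the support, so the method applies directly.

```python
import math

def f_X(x):
    """Density of X ~ Uniform(0, 1)."""
    return 1.0 if 0.0 < x < 1.0 else 0.0

def u(x):
    return x ** 2          # the transformation Y = u(X)

def v(y):
    return math.sqrt(y)    # inverse: solve y = x**2 for x

def v_prime(y):
    return 0.5 / math.sqrt(y)   # derivative of the inverse

def f_Y(y):
    """Density of Y via f_Y(y) = f_X(v(y)) * |v'(y)|."""
    return f_X(v(y)) * abs(v_prime(y))

# Closed form for this example is f_Y(y) = 1 / (2*sqrt(y)) on (0, 1):
print(f_Y(0.25))   # 1.0
```

At y = 0.25 the inverse is v(y) = 0.5 with v′(y) = 1, so the density is 1 · 1 = 1, matching the closed form.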
Distribution of a Sum of RV’s
• If Y = X1 + X2
• E[Y] = E[X1] + E[X2]
• Var[Y] = Var[X1] + Var[X2] + 2·Cov[X1, X2]
• Use the convolution method to find the distribution of a sum of independent RV’s
• Note: the convolution method requires X1 and X2 to be independent
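The convolution step can be illustrated with a small discrete example; two fair six-sided dice are an assumption for illustration, not from the slides.

```python
from itertools import product

# Probability functions of two independent fair dice (hypothetical example).
p1 = {k: 1 / 6 for k in range(1, 7)}
p2 = {k: 1 / 6 for k in range(1, 7)}

def convolve(pA, pB):
    """P(Y = y) = sum over x of P(X1 = x) * P(X2 = y - x), by independence."""
    out = {}
    for a, b in product(pA, pB):
        out[a + b] = out.get(a + b, 0.0) + pA[a] * pB[b]
    return out

p_sum = convolve(p1, p2)
print(p_sum[7])   # 6/36, the most likely total
```

The same double-sum structure carries over to the integral form for continuous RV’s.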
Central Limit Theorem
• If asked for a probability involving a sum of a large number of independent RV’s, you usually need to use the normal approximation
• X1, X2, …, Xn are independent RV’s with the same distribution
• Each Xi has mean μ and standard deviation σ; the sum X1 + … + Xn is then approximately normal with mean nμ and standard deviation σ√n
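As a sketch of the normal approximation, assume (hypothetically) that each Xi is Uniform(0, 1), so μ = 1/2 and σ² = 1/12 per term.

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical setup: Y = X1 + ... + Xn, each Xi ~ Uniform(0, 1).
n, mu, var = 48, 0.5, 1.0 / 12.0
mean_Y = n * mu               # 24
sd_Y = math.sqrt(n * var)     # 2

# P(Y <= 26) ~ Phi((26 - mean_Y) / sd_Y) = Phi(1)
approx = normal_cdf((26 - mean_Y) / sd_Y)
print(round(approx, 4))   # 0.8413
```

The key exam move is standardizing: subtract nμ and divide by σ√n, then read off the standard normal probability.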
Distribution of Maximum or Minimum of a Collection of Independent RV’s
• Suppose X1 and X2 are independent RV’s with known CDFs F1(x) = P(X1 ≤ x) and F2(x) = P(X2 ≤ x)
• U = max{X1, X2}: P(U ≤ u) = F1(u)·F2(u)
• V = min{X1, X2} (trickier): P(V ≤ v) = 1 − [1 − F1(v)]·[1 − F2(v)]
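A minimal sketch of the max/min CDF formulas, assuming (hypothetically) that both components are Uniform(0, 1), so F1(x) = F2(x) = x on [0, 1]:

```python
def F1(x):
    """CDF of Uniform(0, 1), clipped outside the support."""
    return min(max(x, 0.0), 1.0)

def F2(x):
    return min(max(x, 0.0), 1.0)

def F_max(x):
    """P(max <= x) = P(X1 <= x) * P(X2 <= x), by independence."""
    return F1(x) * F2(x)

def F_min(x):
    """P(min <= x) = 1 - P(both exceed x) = 1 - (1 - F1(x)) * (1 - F2(x))."""
    return 1.0 - (1.0 - F1(x)) * (1.0 - F2(x))

print(F_max(0.5), F_min(0.5))   # 0.25 0.75
```

The min really is the trickier one: you must pass through the survival functions 1 − F, not multiply the CDFs directly.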
Mixtures of RV’s
• X1 and X2 are RV’s, each with its own (usually different) probability function
• X is a mixture of X1 and X2:
• f_X(x) = a·f_X1(x) + (1 − a)·f_X2(x), where 0 ≤ a ≤ 1
• Weight “a” on X1
• Weight (1 − a) on X2
• Expectation, probabilities, and moments follow the same “weighted-average” form, e.g. E[X] = a·E[X1] + (1 − a)·E[X2]
• TRAP: Variance is NOT a weighted average of Var[X1] and Var[X2]
• Instead, use the weighted-average approach to find E[X] and E[X^2], then Var[X] = E[X^2] − (E[X])^2
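A quick numeric check of the weighted-average moments and the variance trap; the component moments below are made up for illustration (E[X1] = 0, E[X1²] = 1; E[X2] = 10, Var[X2] = 1 so E[X2²] = 101; a = 0.5).

```python
# Hypothetical component moments.
a = 0.5
E1, E1sq = 0.0, 1.0      # E[X1], E[X1^2]
E2, E2sq = 10.0, 101.0   # E[X2], E[X2^2]

E_X   = a * E1   + (1 - a) * E2     # weighted average works for E[X]
E_Xsq = a * E1sq + (1 - a) * E2sq   # ...and for E[X^2]
var_X = E_Xsq - E_X ** 2            # correct: 51 - 25 = 26

# TRAP: a weighted average of the component variances gives only 1,
# missing the spread between the two component means.
wrong_var = a * (E1sq - E1 ** 2) + (1 - a) * (E2sq - E2 ** 2)
print(var_X, wrong_var)   # 26.0 1.0
```

The gap (26 vs. 1) comes entirely from how far apart the component means sit, which the weighted-average-of-variances shortcut ignores.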
Mixtures of RV’s
• There is a difference between a “sum of RV’s” and a “mixture of RV’s”
• Sum of RV’s: X = X1 + X2
• Mixture of RV’s: X is defined by the probability function f_X(x) = a·f_X1(x) + (1 − a)·f_X2(x)
• So, don’t make the mistake of thinking:

X = a*X1 + (1-a)*X2 (this is wrong)
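A small simulation contrasting the true mixture with the wrong linear combination; the degenerate components (X1 always 0, X2 always 10, a = 0.5) are a hypothetical choice that makes the gap stark.

```python
import random

random.seed(0)

a, x1, x2 = 0.5, 0.0, 10.0   # hypothetical degenerate components

def draw_mixture():
    """Mixture: with prob a draw from X1's distribution, else from X2's."""
    return x1 if random.random() < a else x2

samples = [draw_mixture() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

# The wrong reading, X = a*X1 + (1-a)*X2, would be the constant 5
# (variance 0); the true mixture has mean 5 but variance 25.
print(round(mean, 1), round(var, 1))
```

Both constructions share the mean, which is exactly why the distinction is easy to miss until a variance question appears.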