
Section 9 – Functions and Transformations of Random Variables


Presentation Transcript


  1. Section 9 – Functions and Transformations of Random Variables

  2. Distribution of a transformation of continuous RV: X • Y = u(X) • Y is defined as a function “u” of X • v(u(x)) = x • Function “v” is the inverse function of “u” • Obtain v by solving the given y = u(x) for x • For monotone u, the density of Y is fY(y) = fX(v(y))·|v′(y)|
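To make the change-of-variables recipe concrete, here is a minimal Python sketch (the Exponential(1) distribution and the transform Y = X² are made-up examples, not from the slides): it evaluates fY(y) = fX(v(y))·|v′(y)| and checks the result against a simulation.

```python
import numpy as np

# Made-up example: X ~ Exponential(1), Y = u(X) = X**2.
# Inverse: v(y) = sqrt(y), so f_Y(y) = f_X(v(y)) * |v'(y)|.

def f_X(x):
    return np.exp(-x)                  # density of Exponential(1)

def f_Y(y):
    v = np.sqrt(y)                     # v(y) = sqrt(y)
    dv = 1.0 / (2.0 * np.sqrt(y))      # |v'(y)|
    return f_X(v) * dv

# Check P(0.5 <= Y <= 1.5) two ways: integrating f_Y, and by simulation.
grid = np.linspace(0.5, 1.5, 10_001)
prob_formula = np.sum(f_Y(grid)) * (grid[1] - grid[0])   # crude Riemann sum

rng = np.random.default_rng(0)
y_sim = rng.exponential(1.0, size=200_000) ** 2
prob_sim = np.mean((y_sim >= 0.5) & (y_sim <= 1.5))

print(f"formula: {prob_formula:.4f}   simulation: {prob_sim:.4f}")
```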

  3. Distribution of a Sum of RV’s • If Y = X1 + X2 • E[Y] = E[X1] + E[X2] • Var[Y] = Var[X1] + Var[X2] + 2Cov[X1, X2] • Use the convolution method to find the distribution of a sum of independent RV’s • Note: the convolution method requires X1 and X2 to be independent
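Below is a small Python sketch of the convolution method for a sum of two independent discrete RVs (the two pmfs are made-up examples, not from the slides); it also confirms E[Y] = E[X1] + E[X2].

```python
import numpy as np

# Made-up example: X1 is a fair die (values 1..6); X2 takes 0, 1, 2 with
# probabilities 0.5, 0.3, 0.2.  The pmf of Y = X1 + X2 is the convolution
# of the two pmfs (valid because X1 and X2 are independent).
p1 = np.full(6, 1 / 6)              # P(X1 = 1..6)
p2 = np.array([0.5, 0.3, 0.2])      # P(X2 = 0..2)

p_sum = np.convolve(p1, p2)         # pmf of Y on values 1..8
values = np.arange(1, 1 + len(p_sum))

for v, p in zip(values, p_sum):
    print(f"P(Y = {v}) = {p:.4f}")

# The moment identity from the slide: E[Y] = E[X1] + E[X2].
mean1 = np.dot(np.arange(1, 7), p1)
mean2 = np.dot(np.arange(0, 3), p2)
print("E[Y] =", np.dot(values, p_sum), "=", mean1 + mean2)
```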

  4. Central Limit Theorem • If asked for a probability involving a sum of a large number of independent RV’s, you usually need to use the normal approximation • X1, X2, … Xn are independent RV’s with the same distribution, each with mean μ and standard deviation σ • The sum X = X1 + … + Xn has mean nμ and standard deviation σ√n, and is approximately normal for large n
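A rough Python sketch of the normal approximation (the exponential component distribution, n = 50, and the threshold 110 are made-up choices, not from the slides): the sum of n iid RVs with mean μ and standard deviation σ is treated as Normal with mean nμ and standard deviation σ√n.

```python
import math
import numpy as np

# Made-up example: each X_i is Exponential with mean mu = 2 (so sigma = 2),
# and we want P(S <= 110) for S = X_1 + ... + X_n with n = 50.
n, mu, sigma = 50, 2.0, 2.0

# Normal approximation: S ~ Normal(n*mu, sigma*sqrt(n)) approximately.
z = (110 - n * mu) / (sigma * math.sqrt(n))
approx = 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF at z

# Check by simulation.
rng = np.random.default_rng(1)
sums = rng.exponential(scale=mu, size=(100_000, n)).sum(axis=1)
print(f"normal approx: {approx:.4f}   simulation: {np.mean(sums <= 110):.4f}")
```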

  5. Distribution of Maximum or Minimum of a Collection of Independent RV’s • Suppose X1 and X2 are independent RV’s • U = max{X1, X2} • V = min{X1, X2} (trickier) • We know the CDFs F1(x) and F2(x), where F1(x) = P(X1 <= x) • Then P(U <= u) = F1(u)·F2(u), and P(V <= v) = 1 − (1 − F1(v))·(1 − F2(v))
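The Python sketch below verifies the max/min CDF identities above by simulation (the Uniform(0, 1) and Exponential(1) choices for X1 and X2 are made-up examples, not from the slides).

```python
import numpy as np

# For independent X1, X2 with CDFs F1, F2:
#   P(max <= x) = F1(x) * F2(x)
#   P(min <= x) = 1 - (1 - F1(x)) * (1 - F2(x))
# Made-up example: X1 ~ Uniform(0, 1), X2 ~ Exponential(1), evaluated at x = 0.7.
x = 0.7
F1 = x                        # Uniform(0, 1) CDF at x (valid for 0 <= x <= 1)
F2 = 1 - np.exp(-x)           # Exponential(1) CDF at x

rng = np.random.default_rng(2)
x1 = rng.uniform(size=500_000)
x2 = rng.exponential(size=500_000)

print("P(max <= 0.7):", F1 * F2, "~", np.mean(np.maximum(x1, x2) <= x))
print("P(min <= 0.7):", 1 - (1 - F1) * (1 - F2), "~", np.mean(np.minimum(x1, x2) <= x))
```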

  6. Mixtures of RV’s • X1 and X2 are RV’s, each with its own (usually different) probability function • X is a mixture of X1 and X2 with probability function f(x) = a·f1(x) + (1 − a)·f2(x), where 0 < a < 1 • Weight “a” on X1 • Weight (1 − a) on X2 • Expectation, probabilities, and moments follow a “weighted-average” form, e.g. E[X] = a·E[X1] + (1 − a)·E[X2] • Variance is NOT calculated from a “weighted-average” approach • Instead, use the weighted-average form to find E[X] and E[X²], then Var[X] = E[X²] − (E[X])²
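As a sketch of the weighted-average moment calculations (the two exponential components and the weight a = 0.4 are made-up, not from the slides), the Python below computes E[X] and E[X²] as weighted averages, gets Var[X] = E[X²] − (E[X])², and checks the result by simulating the mixture.

```python
import numpy as np

# Made-up example: mixture of X1 ~ Exponential(mean 1) and X2 ~ Exponential(mean 3)
# with weight a = 0.4 on X1.
a = 0.4
m1, m2 = 1.0, 3.0
e2_1, e2_2 = 2 * m1**2, 2 * m2**2      # E[X^2] = 2 * mean^2 for an exponential

# Weighted-average form for moments of the mixture:
EX  = a * m1 + (1 - a) * m2
EX2 = a * e2_1 + (1 - a) * e2_2
var = EX2 - EX**2                      # NOT a weighted average of the variances

print("E[X] =", EX, "  Var[X] =", var)

# Simulating the mixture: pick the component first, then sample from it.
rng = np.random.default_rng(3)
use_x1 = rng.uniform(size=500_000) < a
samples = np.where(use_x1,
                   rng.exponential(m1, size=500_000),
                   rng.exponential(m2, size=500_000))
print("simulated mean/var:", samples.mean(), samples.var())
```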

  7. Mixtures of RV’s • There is a difference between a “sum of RV’s” and a “mixture of RV’s” • Sum of RV’s: X = X1 + X2 • Mixture of RV’s: X is defined by the probability function f(x) = a·f1(x) + (1 − a)·f2(x) • So, don’t make the mistake of thinking X = a·X1 + (1 − a)·X2 (this is wrong)
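To see why X = a·X1 + (1 − a)·X2 is not the mixture, the Python sketch below (two normal components chosen as a made-up example, not from the slides) compares the two constructions: their means agree, but their variances are very different.

```python
import numpy as np

# Made-up example: X1 ~ Normal(0, 1), X2 ~ Normal(10, 1), weight a = 0.4 on X1.
a = 0.4
n = 500_000
rng = np.random.default_rng(4)
x1 = rng.normal(loc=0.0, scale=1.0, size=n)
x2 = rng.normal(loc=10.0, scale=1.0, size=n)

# Mixture: each observation comes from X1 with probability a, else from X2.
mixture = np.where(rng.uniform(size=n) < a, x1, x2)

# The wrong interpretation: a weighted sum of the two RVs.
weighted_sum = a * x1 + (1 - a) * x2

print("means:    ", mixture.mean(), weighted_sum.mean())   # both about 6
print("variances:", mixture.var(), weighted_sum.var())     # about 25 vs about 0.52
```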
