Section 9 – Functions and Transformations of Random Variables

Distribution of a Transformation of a Continuous RV X
  • Y = u(X)
    • Y is defined as a function “u” of X
  • v(u(x)) = x
    • Function “v” is the inverse of function “u”
    • Obtain v by solving y = u(x) for x in terms of y
  • For a one-to-one (monotonic) u, the density of Y is f_Y(y) = f_X(v(y))*|v'(y)| (see the sketch below)
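A minimal sketch of this recipe in Python (not part of the slide), assuming X ~ Exponential(1) and Y = u(X) = X^2 purely for illustration: solve y = x^2 for x to get v(y) = sqrt(y), apply f_Y(y) = f_X(v(y))*|v'(y)|, and sanity-check the answer by integrating.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustration only: X ~ Exponential(1) and Y = u(X) = X**2, with u monotonic on x >= 0.
# Inverse: v(y) = sqrt(y), so |v'(y)| = 1 / (2*sqrt(y)).
def f_Y(y):
    v = np.sqrt(y)                                  # x = v(y)
    return stats.expon.pdf(v) / (2.0 * np.sqrt(y))  # f_X(v(y)) * |v'(y)|

# Check: P(Y <= 1) should equal P(X <= 1) = 1 - e^(-1), about 0.6321
prob, _ = quad(f_Y, 0, 1)
print(prob, stats.expon.cdf(1.0))
```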
Distribution of a Sum of RV’s
  • If Y = X1 + X2:
    • E[Y] = E[X1] + E[X2]
    • Var[Y] = Var[X1] + Var[X2] + 2Cov[X1,X2]
  • Use the convolution method to find the distribution of a sum of independent RV’s (see the sketch below)
    • Note: convolution requires X1 & X2 to be independent
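A minimal sketch of the convolution idea, using two independent fair dice as an assumed stand-in example: P(Y = k) = sum over j of P(X1 = j)*P(X2 = k − j), which np.convolve computes directly from the two pmf vectors.

```python
import numpy as np

# Illustration only: X1, X2 = two independent fair six-sided dice.
pmf1 = np.full(6, 1 / 6)            # P(X1 = 1), ..., P(X1 = 6)
pmf2 = np.full(6, 1 / 6)

# Convolution: P(Y = k) = sum_j P(X1 = j) * P(X2 = k - j); support of Y is 2..12.
pmf_sum = np.convolve(pmf1, pmf2)
for k, p in enumerate(pmf_sum, start=2):
    print(k, round(p, 4))

# The moment identity E[Y] = E[X1] + E[X2] as a quick check:
vals = np.arange(1, 7)
print("E[Y] =", 2 * (vals @ pmf1))  # 7.0
```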
Central Limit Theorem
  • If asked for a probability involving a sum of a large number of independent RV’s, you usually need to use the normal approximation
    • X1, X2, …, Xn are independent RV’s with the same distribution
      • Each Xi has mean μ and standard deviation σ
    • The sum Y = X1 + … + Xn is then approximately normal with mean nμ and standard deviation σ√n, so standardize with Z = (Y − nμ)/(σ√n) (see the sketch below)
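A minimal sketch of the normal approximation; the setup (a sum of n = 50 independent Exponential(1) RVs and the threshold 60) is assumed purely for illustration, and the exact gamma value is included only as a check.

```python
from math import sqrt
from scipy import stats

# Illustration only: Y = X1 + ... + X50, each Xi ~ Exponential with mean 1,
# so mu = 1, sigma = 1, and Y has mean n*mu and standard deviation sigma*sqrt(n).
n, mu, sigma = 50, 1.0, 1.0
z = (60 - n * mu) / (sigma * sqrt(n))

approx = stats.norm.sf(z)        # normal approximation to P(Y > 60)
exact = stats.gamma(a=n).sf(60)  # exact, since a sum of n Exponential(1)'s is Gamma(n, 1)
print(approx, exact)
```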
Distribution of Maximum or Minimum of a Collection of Independent RV’s
  • Suppose X1 and X2 are independent RV’s
    • U = max{X1, X2}
    • V = min{X1, X2} (trickier)
    • We know F1(x) and F2(x), where F1(x) = P(X1 ≤ x) and F2(x) = P(X2 ≤ x)
    • F_U(u) = F1(u)*F2(u), since U ≤ u requires both X1 ≤ u and X2 ≤ u
    • F_V(v) = 1 − [1 − F1(v)]*[1 − F2(v)], since V > v requires both X1 > v and X2 > v (see the sketch below)
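A minimal sketch checking the identities F_U(t) = F1(t)*F2(t) and F_V(t) = 1 − [1 − F1(t)]*[1 − F2(t)] by simulation; the two exponential distributions and the point t are assumed only for illustration.

```python
import numpy as np
from scipy import stats

# Illustration only: X1 ~ Exponential(mean 1), X2 ~ Exponential(mean 0.5), independent.
rng = np.random.default_rng(1)
x1 = rng.exponential(scale=1.0, size=200_000)
x2 = rng.exponential(scale=0.5, size=200_000)

t = 0.7
F1 = stats.expon(scale=1.0).cdf(t)
F2 = stats.expon(scale=0.5).cdf(t)

# U = max{X1, X2} is <= t only if both are <= t; V = min{X1, X2} exceeds t only if both do.
print(np.mean(np.maximum(x1, x2) <= t), F1 * F2)
print(np.mean(np.minimum(x1, x2) <= t), 1 - (1 - F1) * (1 - F2))
```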
Mixtures of RV’s
  • X1 and X2 are RV’s, each with its own (usually different) probability function
  • X is a mixture of X1 and X2
    • Where 0 < a < 1
    • Weight “a” on X1
    • Weight (1 − a) on X2
  • Expectation, probabilities, and moments follow a “weighted-average” form: f_X(x) = a*f_X1(x) + (1 − a)*f_X2(x), so E[X^k] = a*E[X1^k] + (1 − a)*E[X2^k] and P(X ≤ x) = a*P(X1 ≤ x) + (1 − a)*P(X2 ≤ x)
  • TRAP: Variance is NOT calculated from a “weighted-average” approach
  • Use the weighted-average approach to find E[X] & E[X^2], then Var[X] = E[X^2] − (E[X])^2 (see the sketch below)
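A minimal sketch of the weighted-average rule and the variance trap, with assumed components X1 ~ N(0, 1), X2 ~ N(3, 2^2) and weight a = 0.4 chosen only for illustration.

```python
# Illustration only: mixture of X1 ~ N(0, 1) and X2 ~ N(3, 2**2) with weight a = 0.4.
a = 0.4
m1, v1 = 0.0, 1.0   # E[X1], Var[X1]
m2, v2 = 3.0, 4.0   # E[X2], Var[X2]

# The weighted-average rule applies to moments, so work through E[X] and E[X^2].
EX = a * m1 + (1 - a) * m2
EX2 = a * (v1 + m1**2) + (1 - a) * (v2 + m2**2)  # E[Xi^2] = Var[Xi] + E[Xi]^2
var_correct = EX2 - EX**2

var_trap = a * v1 + (1 - a) * v2  # TRAP: a weighted average of variances is NOT Var[X]
print(EX, var_correct, var_trap)  # 1.8, 4.96, 2.8
```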
Mixtures of RV’s (continued)
  • There is a difference between a “sum of RV’s” and a “mixture of RV’s”
    • Sum of RV’s: X = X1 + X2
    • Mixture of RV’s: X is defined by the probability function f_X(x) = a*f_X1(x) + (1 − a)*f_X2(x)
  • So, don’t make the mistake of thinking X = a*X1 + (1-a)*X2; that is a weighted sum of the RV’s themselves, not a mixture (see the sketch below)
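A minimal sketch contrasting the two constructions by simulation, reusing the assumed N(0, 1) and N(3, 2^2) components with a = 0.4: the mixture draws each observation from one component, while a*X1 + (1-a)*X2 combines both draws and ends up with a very different variance.

```python
import numpy as np

# Illustration only: same assumed components as above, weight a = 0.4.
rng = np.random.default_rng(2)
a, n = 0.4, 200_000
x1 = rng.normal(0.0, 1.0, size=n)
x2 = rng.normal(3.0, 2.0, size=n)

# Mixture: each observation is drawn from X1 with probability a, otherwise from X2.
from_x1 = rng.random(n) < a
mixture = np.where(from_x1, x1, x2)

# Weighted sum of the RVs themselves (the "wrong" reading of a mixture).
weighted_sum = a * x1 + (1 - a) * x2

print(mixture.mean(), mixture.var())            # about 1.8 and 4.96
print(weighted_sum.mean(), weighted_sum.var())  # about 1.8 and 1.6
```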
