
Mixed Moments, Correlation & Independence, Independent Sum



Presentation Transcript


  1. Mixed Moments, Correlation & Independence, Independent Sum. Tutorial 7, STAT1301, Fall 2010, 09NOV2010, MB103@HKU. By Joseph Dong

  2. Recall: Univariate Moment of Order $k$, and Generalization: Mixed Moments • The $k$-th moment of a random variable $X$ is defined as $\mu_k' = E[X^k]$. • The $k$-th central moment of a random variable $X$ is defined as $\mu_k = E[(X - EX)^k]$. • Q: How do we generalize these two definitions to the case of a random vector of $n$ dimensions? • A: We can define the mixed moment of an $n$-dimensional random vector. • Define the mixed moment of order $(k_1, \dots, k_n)$ of a random vector $(X_1, \dots, X_n)$ as $E[X_1^{k_1} X_2^{k_2} \cdots X_n^{k_n}]$. • Define the mixed central moment of a random vector as $E[(X_1 - EX_1)^{k_1} (X_2 - EX_2)^{k_2} \cdots (X_n - EX_n)^{k_n}]$. (A Monte Carlo sketch of these definitions follows below.)
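As a numerical companion to these definitions, here is a minimal Monte Carlo sketch in Python with NumPy; the bivariate normal example and the helper name `mixed_moment` are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def mixed_moment(samples, orders, central=False):
    """Monte Carlo estimate of the mixed (central) moment E[prod_i X_i^{k_i}].

    samples: array of shape (n_draws, n_dims); orders: the exponents (k_1, ..., k_n).
    """
    x = np.asarray(samples, dtype=float)
    if central:
        x = x - x.mean(axis=0)          # subtract E[X_i] componentwise
    return np.prod(x ** np.asarray(orders), axis=1).mean()

# Example: a bivariate standard normal with covariance 0.5.
cov = [[1.0, 0.5], [0.5, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)

# The (1, 1) mixed central moment is the covariance, so this should be near 0.5.
print(mixed_moment(xy, (1, 1), central=True))
```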

  3. Bivariate Mixed Moment of $(X, Y)$ • The mixed moment of order $(p, q)$ of $X$ and $Y$ is defined by $E[X^p Y^q]$. • The mixed central moment of order $(p, q)$ of $X$ and $Y$ is defined by $E[(X - EX)^p (Y - EY)^q]$. • The covariance of two random variables is defined as the 2nd bivariate mixed central moment, of order $(1, 1)$: $\operatorname{Cov}(X, Y) = E[(X - EX)(Y - EY)]$. • Covariance is a bivariate concept. • A convenient identity: $\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y]$. • Properties of $\operatorname{Cov}$: • Symmetry: $\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$. • Positive semi-definiteness: $\operatorname{Cov}(X, X) = \operatorname{Var}(X) \ge 0$. • Linearity in each argument: $\operatorname{Cov}(aX + bY, Z) = a\operatorname{Cov}(X, Z) + b\operatorname{Cov}(Y, Z)$. (Both the identity and the linearity are checked numerically in the sketch below.)
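The identity and the linearity property lend themselves to a quick numerical check; a sketch with arbitrarily chosen distributions for $X$, $Y$, $Z$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
x = rng.exponential(2.0, n)
y = 0.3 * x + rng.normal(0.0, 1.0, n)   # deliberately correlated with x
z = rng.uniform(-1.0, 1.0, n)

cov = lambda u, v: ((u - u.mean()) * (v - v.mean())).mean()

# Identity: Cov(X, Y) = E[XY] - E[X] E[Y]; the two printed numbers should agree.
print(cov(x, y), (x * y).mean() - x.mean() * y.mean())

# Linearity in the first argument: Cov(aX + bY, Z) = a Cov(X, Z) + b Cov(Y, Z).
a, b = 2.0, -3.0
print(cov(a * x + b * y, z), a * cov(x, z) + b * cov(y, z))
```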

  4. Standardization in Statistics • Express the position of a number label by its distance from the expectation, measured in multiples of the standard deviation. • This is as if we re-coordinatize the state space, using the location of the expectation as the origin and the standard deviation as the unit length. • Standardization of a random variable is a one-one transformation (a centralization plus a rescaling) of the random variable. • Using angle brackets to denote the resultant random variable of standardization: $\langle X \rangle = \dfrac{X - EX}{\sigma_X}$. • Purpose of standardization: ease of describing positions. For example: • What is the relative position of the number label 5.3 in the state space of a random variable $X$ following $N(\mu, \sigma^2)$? Compute $\langle 5.3 \rangle = (5.3 - \mu)/\sigma$; here it equals 3.25. Since $\langle X \rangle$ follows $N(0, 1)$ (show this if you are not convinced), and 3.25 is a very high quantile of $N(0, 1)$, the number 5.3 is located quite unusually far to the right in the original state space. (See the quantile computation below.)
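To see how extreme a standardized value of 3.25 is, one can query the standard normal distribution directly; a small sketch using `scipy.stats` (the use of SciPy here is an illustrative choice, not the tutorial's):

```python
from scipy.stats import norm

# How extreme is a standardized value of 3.25 under N(0, 1)?
z = 3.25
print(norm.cdf(z))   # ~0.99942: z sits at roughly the 99.94th percentile
print(norm.sf(z))    # ~0.00058: the probability of landing above z
```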

  5. Correlation as Standardized Covariance • Covariance is a bivariate concept; so is correlation. • Compare: • Covariance of $X$ & $Y$: $\operatorname{Cov}(X, Y) = E[(X - EX)(Y - EY)]$. • Quick question: what is $\operatorname{Cov}(X, Y)$ if $X$ and $Y$ are independent? (It is 0, since independence gives $E[XY] = E[X]E[Y]$.) • Correlation ($\rho$) of $X$ & $Y$: $\rho(X, Y) = \operatorname{Cov}(\langle X \rangle, \langle Y \rangle) = \dfrac{\operatorname{Cov}(X, Y)}{\sigma_X \sigma_Y}$. • Very interestingly, the correlation of any pair of r.v.'s is always bounded, $|\rho(X, Y)| \le 1$, while their covariance can explode. (A numerical illustration follows below.) • Pf. By the Cauchy–Schwarz inequality, $|\rho(X, Y)| = |E[\langle X \rangle \langle Y \rangle]| \le \sqrt{E[\langle X \rangle^2]\, E[\langle Y \rangle^2]} = 1$.
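A sketch illustrating both points: the covariance of a rescaled pair can be made arbitrarily large, while the covariance of the standardized pair (the correlation) is scale-free and stays in $[-1, 1]$. The distributions below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.normal(0.0, 1.0, n)
y = 4.0 * x + rng.normal(0.0, 3.0, n)

cov = lambda u, v: ((u - u.mean()) * (v - v.mean())).mean()
std = lambda u: (u - u.mean()) / u.std()   # standardization <U> = (U - EU)/sigma_U

print(cov(x, y))                           # covariance: ~4
print(cov(1000 * x, 1000 * y))             # covariance "explodes" under rescaling
print(cov(std(x), std(y)))                 # correlation: always in [-1, 1]
print(cov(std(1000 * x), std(1000 * y)))   # and unchanged by the rescaling
```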

  6. Exploring Correlation: demonstrations.wolfram.com • Download the free Mathematica Player if you don't have one. • Search for "correlation". • And explore…

  7. Covariance/Correlation Calculation Exercises • Handout Problem 1 • Handout Problem 2 • Handout Problem 3 • Find the correlation of two random variables $X$ and $Y$ which are functionally dependent as … . What about …? (A hypothetical instance of the phenomenon this exercise targets is sketched below.)
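The functional relation in the last exercise did not survive extraction. Purely as an illustration of the phenomenon such exercises usually target, here is a hypothetical instance, $Y = X^2$ with $X$ symmetric about 0: $Y$ is completely determined by $X$, yet the two are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 500_000)   # hypothetical choice: X symmetric about 0
y = x ** 2                          # Y is a deterministic function of X

# Cov(X, X^2) = E[X^3] - E[X] E[X^2] = 0 for symmetric X, so the
# correlation is ~0 despite complete functional dependence.
print(np.corrcoef(x, y)[0, 1])
```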

  8. Independent Sum • An independent sum refers to the sum $S = X_1 + X_2 + \cdots + X_n$ of independent random variables. • $S$ is a random variable itself: it has a sample space, a state space, and a probability measure (and distribution) on the sample space. • We are now interested in finding the following moments of $S$: • Expectation (too easy, and independence is actually not needed): $E[S] = \sum_{i=1}^{n} E[X_i]$. Just sum the expectations. • Variance (a bit of proof work required; uses the independence of all pairs $X_i, X_j$): it turns out that this is also just the sum of the variances, $\operatorname{Var}(S) = \sum_{i=1}^{n} \operatorname{Var}(X_i)$. The proof uses the properties of covariance: $\operatorname{Var}(S) = \sum_i \operatorname{Var}(X_i) + \sum_{i \ne j} \operatorname{Cov}(X_i, X_j)$, and every cross term vanishes under independence. • MGF (now easy, because we have proved Theorem C of Tutorial 6): $M_S(t) = \prod_{i=1}^{n} M_{X_i}(t)$. Finding the MGF is equivalent to finding the distribution. (A simulation check of these three facts follows below.) • If we consider a pair of independent sums, we are also interested in finding their covariance (this is easy too), using the properties of covariance: $\operatorname{Cov}\big(\sum_i X_i, \sum_j Y_j\big) = \sum_i \sum_j \operatorname{Cov}(X_i, Y_j)$.
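A simulation sketch checking the three facts above; the summand distributions and the evaluation point $t = 0.1$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

# Three independent summands with different distributions.
x1 = rng.exponential(2.0, n)        # mean 2, variance 4
x2 = rng.uniform(0.0, 6.0, n)       # mean 3, variance 3
x3 = rng.normal(1.0, 2.0, n)        # mean 1, variance 4
s = x1 + x2 + x3

print(s.mean())   # ~6 = 2 + 3 + 1 (additivity of expectation)
print(s.var())    # ~11 = 4 + 3 + 4 (additivity of variance under independence)

# MGF factorization: M_S(t) = M_{X1}(t) M_{X2}(t) M_{X3}(t), checked empirically.
t = 0.1
mgf = lambda x: np.exp(t * x).mean()
print(mgf(s), mgf(x1) * mgf(x2) * mgf(x3))
```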

  9. Independent Sum: Finding Its Distribution • The previous slide gives one method for finding the distribution of $S$: through its moment generating function, since under the independence condition the MGF is very convenient to derive. But there is one problem: what if you don't recognize the resulting form of the MGF? (Assuming we are blind to the divinely clever integral-transform inversion methods.) • We can also find the distribution of $S$ by working with the probability measure directly: for two independent continuous random variables, $f_{X+Y}(s) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(s - x)\, dx$, the convolution of the two densities (sketched numerically below).
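As an illustration of the direct approach, here is a sketch that convolves two Uniform(0, 1) densities numerically and compares the result with a simulated sum; the uniform example is an illustrative choice, and its true sum density is the triangular density on $[0, 2]$:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate S = X + Y for independent X, Y ~ Uniform(0, 1).
s = rng.uniform(0, 1, 1_000_000) + rng.uniform(0, 1, 1_000_000)

# Numerical convolution of the two densities on a grid of step dx.
dx = 0.001
grid = np.arange(0, 1, dx)
f = np.ones_like(grid)                # Uniform(0, 1) density on [0, 1)
conv = np.convolve(f, f) * dx         # density of the sum on [0, 2)

# Compare the convolution against a simulated histogram at s = 0.5 and s = 1.0.
hist, edges = np.histogram(s, bins=200, range=(0, 2), density=True)
for point in (0.5, 1.0):
    print(conv[int(point / dx)], hist[int(point / 0.01)])
```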

  10. Exercises: Handout Problems 4, 5, 6
