
Expectation


Presentation Transcript


  1. Expectation

  2. Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of X, E(X), is defined to be: and if X is continuous with probability density function f(x):
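
The defining formulas on this slide are images in the original and do not appear in the transcript; the standard definitions they refer to are:

  E(X) = \sum_{x} x\,p(x) \quad (X \text{ discrete}), \qquad E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx \quad (X \text{ continuous}).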

  3. Example: Suppose we are observing a seven-game series where the teams are evenly matched and the games are independent. Let X denote the length of the series. Find: • the distribution of X, and • the expected value of X, E(X).

  4. Solution: Let A denote the event that team A wins and B denote the event that team B wins. Then the sample space for this experiment (together with probabilities and values of X) would be (next slide):

  5. continued • At this stage it is recognized that it might be easier to determine the distribution of X using counting techniques.

  6. The possible values of X are {4, 5, 6, 7}. • The probability of any particular sequence of length x is (½)^x. • The series can be won by either A or B. • If the series is of length x and won by one of the teams (A, say), then the number of such series is counted as follows: • In a series that lasts x games, the winning team wins 4 games and the losing team wins x − 4 games. The winning team has to win the last game. The no. of ways of choosing the games that the losing team wins is:

  7. Thus the probability of a series of length x is: (the no. of ways of choosing the winning team) × (the no. of ways of choosing the games that the losing team wins) × (the probability of a particular sequence of length x).
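
The counting on the last two slides leads to the following probability function and expected value (reconstructed here, since the formulas are images in the original):

  p(x) = 2\binom{x-1}{x-4}\left(\tfrac{1}{2}\right)^{x} = \binom{x-1}{3}\left(\tfrac{1}{2}\right)^{x-1}, \qquad x = 4, 5, 6, 7,

giving p(4) = 2/16, p(5) = 4/16, p(6) = 5/16, p(7) = 5/16, and hence

  E(X) = 4\cdot\tfrac{2}{16} + 5\cdot\tfrac{4}{16} + 6\cdot\tfrac{5}{16} + 7\cdot\tfrac{5}{16} = \tfrac{93}{16} \approx 5.81.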

  8. Interpretation of E(X) • The expected value of X, E(X), is the centre of gravity of the probability distribution of X. • The expected value of X, E(X), is the long-run average value of X (shown later: Law of Large Numbers).

  9. Example: The Binomial distribution Let X be a discrete random variable having the Binomial distribution, i.e. X = the number of successes in n independent repetitions of a Bernoulli trial. Find the expected value of X, E(X).

  10. Solution:
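
The algebra on the solution slide is not reproduced in the transcript; writing p for the success probability of each trial, the standard computation it carries out is:

  E(X) = \sum_{x=0}^{n} x\binom{n}{x}p^{x}(1-p)^{n-x} = np\sum_{x=1}^{n}\binom{n-1}{x-1}p^{x-1}(1-p)^{n-x} = np.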

  11. Example: A continuous random variable The Exponential distribution Let X have an exponential distribution with parameter λ. This will be the case if: • P[X ≥ 0] = 1, and • P[x ≤ X ≤ x + dx | X ≥ x] = λdx (for small dx). The probability density function of X is: The expected value of X is:

  12. We will determine this integral using integration by parts.

  13. Summary: If X has an exponential distribution with parameter λ then:
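
The density and the integration-by-parts result summarized on slides 11 to 13 are standard; reconstructed here:

  f(x) = \lambda e^{-\lambda x}, \; x \ge 0, \qquad E(X) = \int_{0}^{\infty} x\,\lambda e^{-\lambda x}\,dx = \left[-x e^{-\lambda x}\right]_{0}^{\infty} + \int_{0}^{\infty} e^{-\lambda x}\,dx = \frac{1}{\lambda}.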

  14. Example: The Uniform distribution Suppose X has a uniform distribution from a to b. Then: The expected value of X is:
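
The missing formulas here are the standard ones for the uniform density and its mean:

  f(x) = \frac{1}{b-a}, \; a \le x \le b, \qquad E(X) = \int_{a}^{b} \frac{x}{b-a}\,dx = \frac{b^{2}-a^{2}}{2(b-a)} = \frac{a+b}{2}.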

  15. Example: The Normal distribution Suppose X has a Normal distribution with parameters μ and σ. Then: The expected value of X is: Make the substitution:

  16. Hence, after the substitution, the integral splits into two pieces; the piece that is odd in z integrates to 0 and the remaining piece integrates to 1, so E(X) = μ.
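
A sketch of the calculation these two slides carry out (with the substitution z = (x − μ)/σ):

  E(X) = \int_{-\infty}^{\infty} \frac{x}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^{2}/(2\sigma^{2})}\,dx = \int_{-\infty}^{\infty} (\mu + \sigma z)\,\frac{1}{\sqrt{2\pi}}\, e^{-z^{2}/2}\,dz = \mu\cdot 1 + \sigma\cdot 0 = \mu.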

  17. Example: The Gamma distribution Suppose X has a Gamma distribution with parameters α and λ. Then: Note: This is a very useful formula when working with the Gamma distribution.

  18. The expected value of X is computed by rewriting the integrand as a Gamma(α + 1, λ) density; the remaining integral is then equal to 1.

  19. Thus if X has a Gamma(α, λ) distribution then the expected value of X is E(X) = α/λ. Special cases: • Exponential(λ) distribution: α = 1, λ arbitrary, so E(X) = 1/λ. • Chi-square(n) distribution: α = n/2, λ = ½, so E(X) = n.
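
The Gamma formulas referenced above, written out (standard results, shown only as images in the original):

  f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \; x > 0, \qquad \int_{0}^{\infty} x^{\alpha-1} e^{-\lambda x}\,dx = \frac{\Gamma(\alpha)}{\lambda^{\alpha}},

  E(X) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\int_{0}^{\infty} x^{\alpha} e^{-\lambda x}\,dx = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\cdot\frac{\Gamma(\alpha+1)}{\lambda^{\alpha+1}} = \frac{\alpha}{\lambda}.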

  20. The Gamma distribution

  21. The Exponential distribution

  22. The Chi-square (χ²) distribution

  23. Expectation of functions of Random Variables

  24. Definition Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of g(X), E[g(X)], is defined to be: and if X is continuous with probability density function f(x):
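
As on slide 2, the defining formulas are not in the transcript; the standard definitions are:

  E[g(X)] = \sum_{x} g(x)\,p(x) \quad (X \text{ discrete}), \qquad E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx \quad (X \text{ continuous}).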

  25. Example: The Uniform distribution Suppose X has a uniform distribution from 0 to b. Then: Find the expected value of A = X². If X is the length of a side of a square (chosen at random from 0 to b), then A is the area of the square, and its expected value is 1/3 of the maximum area of the square.
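
The computation behind that statement (reconstructed):

  E(X^{2}) = \int_{0}^{b} x^{2}\,\frac{1}{b}\,dx = \frac{b^{2}}{3} = \tfrac{1}{3}\times b^{2} \quad (b^{2} = \text{maximum area}).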

  26. Example: The Geometric distribution Suppose X (discrete) has a geometric distribution with parameter p. Then: Find the expected value of X and the expected value of X².

  27. Recall: the sum of a geometric series. Differentiating both sides with respect to r we get:

  28. Thus This formula could also be developed by noting:

  29. This formula can be used to calculate:

  30. To compute the expected value of X², we need to find a formula for the corresponding sum. Note: differentiating with respect to r we get:

  31. Differentiating again with respect to r we get: Thus:

  32. This implies: Thus:

  33. Thus
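
The derivation on slides 26 to 33 is only sketched in the transcript. Assuming the parameterization p(x) = p(1 − p)^{x−1}, x = 1, 2, … (a plausible reading of these slides, not stated explicitly in the text), the geometric-series identities give:

  \sum_{x=1}^{\infty} r^{x} = \frac{r}{1-r}, \qquad \sum_{x=1}^{\infty} x r^{x-1} = \frac{1}{(1-r)^{2}}, \qquad \sum_{x=1}^{\infty} x(x-1) r^{x-2} = \frac{2}{(1-r)^{3}},

so, with r = 1 − p,

  E(X) = p\sum_{x=1}^{\infty} x (1-p)^{x-1} = \frac{1}{p}, \qquad E(X^{2}) = E[X(X-1)] + E(X) = \frac{2(1-p)}{p^{2}} + \frac{1}{p}.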

  34. Moments of Random Variables

  35. Definition Let X be a random variable (discrete or continuous), then the k-th moment of X is defined to be: The first moment of X, μ = μ₁ = E(X), is the center of gravity of the distribution of X. The higher moments give different information regarding the distribution of X.

  36. Definition Let X be a random variable (discrete or continuous), then the k-th central moment of X is defined to be: where μ = μ₁ = E(X) = the first moment of X.
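
The defining formulas on slides 35 and 36 are not in the transcript; writing μ_k for the k-th moment and μ_k^0 for the k-th central moment (notation assumed here), they are:

  \mu_{k} = E(X^{k}) = \sum_{x} x^{k}\,p(x) \;\text{ or }\; \int_{-\infty}^{\infty} x^{k}\,f(x)\,dx, \qquad \mu_{k}^{0} = E\!\left[(X-\mu)^{k}\right].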

  37. The central moments describe how the probability distribution is distributed about the centre of gravity, μ. The 2nd central moment depends on the spread of the probability distribution of X about μ; it is called the variance of X and is denoted by the symbol var(X).
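
For reference, the variance formula implicit here (a standard identity, not shown in the transcript):

  \text{var}(X) = \mu_{2}^{0} = E\!\left[(X-\mu)^{2}\right] = E(X^{2}) - \mu^{2}.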

  38. The square root of the variance is called the standard deviation of X and is denoted by the symbol σ. The third central moment contains information about the skewness of a distribution.

  39. The third central moment contains information about the skewness of a distribution. A measure of skewness is:
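
The measure of skewness referred to is presumably the usual coefficient (the formula itself is an image in the original):

  \gamma_{1} = \frac{\mu_{3}^{0}}{\sigma^{3}} = \frac{E\!\left[(X-\mu)^{3}\right]}{\sigma^{3}}.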

  40. Positively skewed distribution

  41. Negatively skewed distribution

  42. Symmetric distribution

  43. The fourth central moment also contains information about the shape of a distribution. The property of shape that is measured by the fourth central moment is called kurtosis. The measure of kurtosis is:
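
The kurtosis measure is likewise not shown; the usual coefficient (some texts subtract 3 so that the normal distribution scores zero) is:

  \gamma_{2} = \frac{\mu_{4}^{0}}{\sigma^{4}} = \frac{E\!\left[(X-\mu)^{4}\right]}{\sigma^{4}}.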

  44. Mesokurtic distribution

  45. Platykurtic distribution

  46. Leptokurtic distribution

  47. Example: The uniform distribution from 0 to 1 Finding the moments

  48. Finding the central moments:
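
Worked out (the integrals are images in the original), for X uniform on (0, 1) and using the notation above:

  \mu_{k} = E(X^{k}) = \int_{0}^{1} x^{k}\,dx = \frac{1}{k+1}, \quad\text{so } \mu = \tfrac{1}{2},

  \mu_{k}^{0} = E\!\left[(X-\tfrac{1}{2})^{k}\right] = \int_{0}^{1}\left(x-\tfrac{1}{2}\right)^{k} dx = \begin{cases} 0 & k \text{ odd}, \\ \dfrac{1}{(k+1)\,2^{k}} & k \text{ even}. \end{cases}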
