
Statistics and Mathematics for Economics


Presentation Transcript


  1. Statistics and Mathematics for Economics Statistics Component: Lecture Two

  2. Objectives of the Lecture • To provide definitions of a random variable • To show different ways of presenting a probability distribution • To explain how a marginal probability distribution can be derived from a joint probability distribution • To inform you of measures which can be used to summarise a probability distribution • To state rules relating to the expectations operator

  3. Definitions of a Random Variable • A variable, the value of which is not known with certainty • A variable, the value of which is the outcome of an experiment • A variable, the value of which is underpinned by a probability distribution • A probability distribution indicates the possible values of a variable, together with the associated probabilities of occurrence

  4. Example of a Random Variable X = the number which is obtained following a single throw of a die. The possible values of X are 1, 2, 3, 4, 5 and 6. Using the classical approach, the probability of each of these six values is 1/6.

  5. Mathematical Presentation of the Probability Distribution of X P(X = x) = 1/6, x = 1, 2, 3, 4, 5, 6.
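This probability mass function can be written down directly in code; a minimal sketch in Python, using exact fractions rather than floating-point values (the dictionary name `pmf_X` is an illustrative choice, not notation from the lecture):

```python
from fractions import Fraction

# pmf of X, the value shown by a single throw of a fair die:
# P(X = x) = 1/6 for x = 1, 2, ..., 6.
pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}

print(pmf_X[3])  # probability that the die shows a 3, i.e. 1/6
```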

  6. Diagrammatic Presentation of the Probability Distribution of X [Figure: bar chart with X = 1, 2, 3, 4, 5, 6 on the horizontal axis and P(X) on the vertical axis; each bar has height 1/6]

  7. Presentation of the probability distribution of X in the form of a table

  x    P(X = x)
  1    1/6
  2    1/6
  3    1/6
  4    1/6
  5    1/6
  6    1/6

  8. Properties of a Probability Distribution The probability that the random variable is equal to a specific value or falls within a range of values cannot be less than zero or greater than one. The sum of the probabilities across the possible values of the random variable is equal to one.
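Both properties can be checked mechanically for the die's distribution; a small sketch in Python, under the same fair-die assumption as the earlier slides:

```python
from fractions import Fraction

pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}

# Property 1: no probability is less than zero or greater than one.
assert all(0 <= p <= 1 for p in pmf_X.values())

# Property 2: the probabilities sum to one across all possible values.
assert sum(pmf_X.values()) == 1

print("both properties hold")
```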

  9. A Joint Probability Distribution It is possible for a probability distribution to relate to more than one random variable. Where a probability distribution relates to more than one variable, it is referred to as a joint probability distribution. A joint probability distribution indicates all of the possible combinations of values of the random variables, together with their associated probabilities of occurrence.

  10. An Example of Two Random Variables Two dice are thrown. In order to be able to distinguish between them, one die is coloured blue and the other is coloured red. X = the number which is shown on the blue die. Y = the number which is shown on the red die. There are thirty-six possible pairs of values of X and Y. Using the classical approach, each of these pairs is associated with a probability of 1/36.

  11. Mathematical Presentation of the Joint Probability Distribution of X and Y P(X = x, Y = y) = 1/36, x, y = 1, 2, 3, 4, 5, 6.

  12. Joint Probability Distribution of X and Y

                        Value of Y
                   1     2     3     4     5     6
  Value   1      1/36  1/36  1/36  1/36  1/36  1/36
  of X    2      1/36  1/36  1/36  1/36  1/36  1/36
          3      1/36  1/36  1/36  1/36  1/36  1/36
          4      1/36  1/36  1/36  1/36  1/36  1/36
          5      1/36  1/36  1/36  1/36  1/36  1/36
          6      1/36  1/36  1/36  1/36  1/36  1/36
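The joint distribution in the table can also be represented programmatically; a sketch in Python using a dictionary keyed by (x, y) pairs (the name `joint` is an illustrative choice):

```python
from fractions import Fraction

# Joint pmf of X (blue die) and Y (red die):
# P(X = x, Y = y) = 1/36 for every one of the thirty-six pairs.
joint = {(x, y): Fraction(1, 36)
         for x in range(1, 7)
         for y in range(1, 7)}

print(len(joint))     # 36 possible pairs
print(joint[(2, 5)])  # 1/36
```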

  13. Properties of a Joint Probability Distribution The probability of any combination of values of the random variables cannot be less than zero and cannot be greater than one. The sum of the joint probabilities across all of the possible pairs of values of the random variables is equal to one.

  14. Derivation of the Probabilities of X and Y Given the table which shows the joint probability distribution of X and Y: the probabilities of the values of X can be obtained by adding together the joint probabilities in each of the rows; the probabilities of the values of Y can be obtained by adding together the joint probabilities in each of the columns.

  15. Marginal Probability Distributions of X and Y

                        Value of Y
                   1     2     3     4     5     6   P(X = x)
  Value   1      1/36  1/36  1/36  1/36  1/36  1/36    1/6
  of X    2      1/36  1/36  1/36  1/36  1/36  1/36    1/6
          3      1/36  1/36  1/36  1/36  1/36  1/36    1/6
          4      1/36  1/36  1/36  1/36  1/36  1/36    1/6
          5      1/36  1/36  1/36  1/36  1/36  1/36    1/6
          6      1/36  1/36  1/36  1/36  1/36  1/36    1/6
  P(Y = y)        1/6   1/6   1/6   1/6   1/6   1/6

  16. Statistical Independence of X and Y It is apparent that, for x, y = 1, 2, 3, 4, 5, 6, P(X = x, Y = y) = P(X = x).P(Y = y) (1/36 = 1/6 × 1/6). As a consequence, X and Y can be declared statistically independent random variables.
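The row and column sums, and the independence check, can both be carried out in a few lines; a sketch in Python for the two-dice example (variable names are illustrative):

```python
from fractions import Fraction

joint = {(x, y): Fraction(1, 36)
         for x in range(1, 7) for y in range(1, 7)}

# Marginal pmf of X: add the joint probabilities along each row (over y).
pmf_X = {x: sum(joint[(x, y)] for y in range(1, 7)) for x in range(1, 7)}

# Marginal pmf of Y: add the joint probabilities down each column (over x).
pmf_Y = {y: sum(joint[(x, y)] for x in range(1, 7)) for y in range(1, 7)}

# X and Y are statistically independent if every joint probability
# equals the product of the corresponding marginal probabilities.
independent = all(joint[(x, y)] == pmf_X[x] * pmf_Y[y]
                  for x in range(1, 7) for y in range(1, 7))

print(independent)  # True
```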

  17. Measures for summarising a marginal probability distribution • A probability distribution can be summarised by its moments • The first moment equates with the population mean or the expected value of the random variable and provides a measure of central tendency • The second central moment equates with the variance of the random variable and provides a measure of dispersion • The value of the third central moment indicates whether the probability distribution is symmetrical or skewed • The fourth central moment provides a measure of kurtosis

  18. Definitions of the Expected Value of a Random Variable An expected value is a weighted sum of the possible values of the random variable, where the associated marginal probabilities serve as the weights. Mathematical definition of the expected value of a discrete random variable, X: E[X] = Σ x.P(X = x), where the sum is taken over all possible values x.

  19. An Example of the Calculation of the Expected Value

  x    P(X = x)    x.P(X = x)
  1    1/6         1/6
  2    1/6         2/6
  3    1/6         3/6
  4    1/6         4/6
  5    1/6         5/6
  6    1/6         6/6
                   -----
                   21/6

  Thus, E[X] = 3.5
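The calculation in the table amounts to a single weighted sum; a sketch in Python, again using exact fractions:

```python
from fractions import Fraction

pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum of x.P(X = x) over all possible values x.
E_X = sum(x * p for x, p in pmf_X.items())

print(E_X)         # 7/2 (the fraction 21/6 in lowest terms)
print(float(E_X))  # 3.5
```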

  20. Rules Relating to the Expectations Operator • Rule 1 – the expected value of a constant is equal to the value of the constant, itself • Rule 2 – the expected value of the sum of a constant and a random variable is equal to the sum of the constant and the expected value of the random variable • Rule 3 – the expected value of the product of a constant and a random variable is equal to the product of the constant and the expected value of the random variable • Rule 4 – the expected value of a linear function of a random variable is equal to the same linear function of the expected value of the random variable

  21. An Example of the Application of Rule 4 A game is played which involves throwing a die and observing the number which is obtained. X = the number which is obtained as a result of a single throw of the die.

  Probability distribution of X

  x          1    2    3    4    5    6
  P(X = x)   1/6  1/6  1/6  1/6  1/6  1/6

  E[X] = 3.5

  22. Expected Net Gain Let us suppose that £5 has to be paid for the privilege of throwing the die. In return, though, the player receives, in pounds, twice the number which appears on the die. What is the expected net gain from throwing the die? Y = the net gain which is made from throwing the die. Inefficient approach: E[Y] = Σ y.P(Y = y), summing over all possible values y.

  23. Probability Distribution and Expected Value of Y

  x    y     P(X = x) = P(Y = y)    y.P(Y = y)
  1    -3    1/6                    -3/6
  2    -1    1/6                    -1/6
  3     1    1/6                     1/6
  4     3    1/6                     3/6
  5     5    1/6                     5/6
  6     7    1/6                     7/6
                                    -----
                                    12/6

  E[Y] = £2

  24. More efficient approach towards calculating E[Y] Y = -5 + 2X E[Y] = E[-5 + 2X] Using E[a + bX] = a + bE[X], E[Y] = -5 + 2E[X] E[X] = 3.5, and so, on substitution, E[Y] = -5 + 2(3.5) = £2.00
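The inefficient and efficient approaches can be verified side by side; a sketch in Python, assuming the fair-die pmf from the earlier slides:

```python
from fractions import Fraction

pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}
E_X = sum(x * p for x, p in pmf_X.items())  # 7/2, i.e. 3.5

# Inefficient approach: sum y.P(Y = y) directly, where y = -5 + 2x.
E_Y_direct = sum((-5 + 2 * x) * p for x, p in pmf_X.items())

# Efficient approach (Rule 4): E[a + bX] = a + b.E[X].
E_Y_rule = -5 + 2 * E_X

print(E_Y_direct, E_Y_rule)  # both equal 2, i.e. an expected net gain of £2
```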
