
Presentation Transcript


  1. ECONOMETRICS, Chapter 4. Multiple Regression = more than one explanatory variable: Yi = B1 + B2 X2i + B3 X3i + ui. The independent variables are X2 and X3; X2i is the ith observation of X2.

  2. Yi = B1 + B2 X2i + B3 X3i + ui. B2 and B3 are partial regression coefficients: B2 measures the change in E(Y) for a unit change in X2, holding the value of X3 constant. The sample regression function, with parameter estimates, is Yi = b1 + b2 X2i + b3 X3i + ei.
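A minimal sketch of estimating such a sample regression function on simulated data, using Python and statsmodels (the variable names and the numbers 1, 2, 3 are illustrative, not from the slides):

```python
import numpy as np
import statsmodels.api as sm

# Simulated data for Yi = B1 + B2*X2i + B3*X3i + ui (all values illustrative)
rng = np.random.default_rng(0)
n = 100
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
Y = 1.0 + 2.0 * X2 + 3.0 * X3 + rng.normal(size=n)

# Design matrix with a constant column for the intercept B1
X = sm.add_constant(np.column_stack([X2, X3]))
fit = sm.OLS(Y, X).fit()
print(fit.params)  # b1, b2, b3: the sample estimates of B1, B2, B3
```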

  3. Estimating the impact of GDP and population on education expenditures: Educi = 414 + 0.052 GDPi - 50 Popi. Holding population fixed, education spending increases by 5.2¢ for every $1 of GDP. Compare the simple regression Educi = -161 + 0.048 GDPi. GDP and population are correlated, so when we don't control for population, part of the population effect gets picked up by GDP.

  4. Estimating the impact of GDP and population on education expenditures: Educi = 414 + 0.052 GDPi - 50 Popi. Holding GDP fixed, education spending decreases by $50 for each additional person. Compare the simple regression Educi = 2,946 + 78.7 Popi: when we don't control for GDP, population picks up the GDP effect, and its coefficient even switches sign.
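The way an omitted, correlated regressor gets "picked up" can be reproduced on simulated data; a sketch under assumed numbers, not the actual GDP/population data behind the slides:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
pop = rng.normal(10.0, 2.0, size=n)
gdp = 2000.0 * pop + rng.normal(0.0, 500.0, size=n)            # the two regressors are correlated
educ = 400.0 + 0.05 * gdp - 50.0 * pop + rng.normal(0.0, 30.0, size=n)

# Full model: the coefficients on gdp and pop land near 0.05 and -50
full = sm.OLS(educ, sm.add_constant(np.column_stack([gdp, pop]))).fit()

# Omitting gdp: pop's coefficient turns positive because it absorbs the gdp effect
short = sm.OLS(educ, sm.add_constant(pop)).fit()
print(full.params, short.params)
```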

  5. The Classical Linear Regression Model: one more assumption. 8. No exact linear relationship between the explanatory variables, i.e. no perfect multicollinearity. Example of multicollinearity: X2 = population of the state, X3 = female population of the state, X4 = male population of the state. Linear relationship: X2 = X3 + X4.

  6. Second example of multicollinearity: X2 = proportion of females in the state, X3 = proportion of males in the state. Linear relationship: X2 = 1 - X3. Perfect collinearity is rare; the software reports an error if it occurs. Regression is possible with high (but not perfect) collinearity, but caution is needed in interpreting the coefficients.
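One way to detect an exact linear relationship among regressors is to check the rank of the design matrix; a sketch using the population example above, with made-up state data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
female_pop = rng.uniform(1e5, 5e5, size=n)
male_pop = rng.uniform(1e5, 5e5, size=n)
total_pop = female_pop + male_pop              # exact relationship: X2 = X3 + X4

X = np.column_stack([np.ones(n), total_pop, female_pop, male_pop])
print(np.linalg.matrix_rank(X))                # 3, not 4: X'X is singular, so OLS has no unique solution
```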

  7. Estimation of Parameters. The procedures for estimating parameters by OLS are the same as before (the equations just become more complicated), and standard errors of the estimators are calculated in much the same way. We estimate the variance of the disturbance term in the population from the residuals in the sample: σ̂2 = ∑ei2 / (n - k), where k is the number of coefficients estimated.
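A sketch of that estimator in code, assuming the residuals come from a model such as the one fitted in the earlier sketch:

```python
import numpy as np

def residual_variance(residuals: np.ndarray, k: int) -> float:
    """Estimate sigma^2 as the residual sum of squares divided by (n - k)."""
    n = len(residuals)
    return float(np.sum(residuals ** 2) / (n - k))

# e.g. residual_variance(fit.resid, k=3) for the three-coefficient model fitted above
```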

  8. Estimating Goodness of Fit; Hypothesis Testing. As before, R2 is used as a measure of goodness of fit: R2 = ESS / TSS. Testing the null hypothesis that Bi = 0 is the same as before, except that df = n - k.
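A small helper for R2 as ESS over TSS (valid when the model includes an intercept); y and yhat are assumed to be the observations and the fitted values:

```python
import numpy as np

def r_squared(y: np.ndarray, yhat: np.ndarray) -> float:
    """R^2 = ESS / TSS, with both sums of squares taken around the mean of y."""
    ess = np.sum((yhat - y.mean()) ** 2)   # explained sum of squares
    tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
    return float(ess / tss)
```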

  9. The test-of-significance approach to hypothesis testing. Educi = 414 + 0.052 GDPi - 50 Popi. Test statistic for the intercept: t = b1 / se(b1) = 414 / 267 = 1.55. p = TDIST(t, df, tails); 1 tail: p = 0.065, 2 tails: p = 0.13. (Figure: t distribution with -1.55 and +1.55 marked.)
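The same calculation in Python, using scipy's t distribution in place of Excel's TDIST; the estimate 414 and standard error 267 are from the slide, and df = 35 is taken as given (n - k, consistent with the F-test slide below):

```python
from scipy import stats

b1, se_b1, df = 414.0, 267.0, 35      # estimate, standard error, and df = n - k
t = b1 / se_b1                        # about 1.55
p_one_tail = stats.t.sf(abs(t), df)   # about 0.065, matching TDIST(1.55, 35, 1)
p_two_tail = 2 * p_one_tail           # about 0.13
print(t, p_one_tail, p_two_tail)
```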

  10. Testing the Joint Hypothesis that B2 = B3 = 0. F = [R2 / (k - 1)] / [(1 - R2) / (n - k)]. Testing that all the coefficients* are equal to zero is the same as testing that R2 = 0. (* All coefficients except the intercept, B1.) F follows the F distribution with (k - 1) df in the numerator and (n - k) df in the denominator.

  11. From the regression of education expenditures on GDP and population (R2 = 0.962): F = (0.962 / 2) / (0.038 / 35) = 443.0. p = FDIST(F, df1, df2) = FDIST(443, 2, 35) = 1.6E-25 ≈ 0.000. * Note: this F statistic is reported in standard regression output.
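The same arithmetic with scipy's F distribution in place of Excel's FDIST; k = 3 and n = 38 are inferred from the degrees of freedom 2 and 35 on the slide:

```python
from scipy import stats

r2, k, n = 0.962, 3, 38                       # 3 coefficients; n = 38 implied by n - k = 35
F = (r2 / (k - 1)) / ((1 - r2) / (n - k))     # about 443
p = stats.f.sf(F, k - 1, n - k)               # about 1.6e-25
print(F, p)
```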

  12. Adjusted R2. Adjusted R2 = 1 - (1 - R2)(n - 1) / (n - k). Adjusted R2 is a goodness-of-fit measure that is adjusted for the number of explanatory variables: R2 never decreases as you add explanatory variables, but adjusted R2 can fall, because the (n - 1) / (n - k) penalty grows with k.
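A one-line implementation of the adjustment, shown with the figures from the education regression (a sketch, not part of the original slides):

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k), where k counts all estimated coefficients."""
    return 1 - (1 - r2) * (n - 1) / (n - k)

print(adjusted_r2(0.962, n=38, k=3))  # about 0.960, using the education-regression figures
```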
