
Multiple Regression II



Presentation Transcript


  1. Multiple Regression II KNNL – Chapter 7

  2. Extra Sums of Squares
  • For a given dataset, the total sum of squares (SSTO) stays the same no matter which predictors are included (provided no observations are dropped for missing values)
  • As we add predictors, the regression sum of squares (SSR) increases (more precisely, never decreases) and the error sum of squares (SSE) decreases (never increases)
  • SSR + SSE = SSTO, regardless of which predictors are in the model
  • For a model containing only X1, write SSR(X1), SSE(X1); for a model containing X1 and X2, write SSR(X1,X2), SSE(X1,X2)
  • The predictive contribution of X2 above that of X1 is the extra sum of squares SSR(X2|X1) = SSE(X1) – SSE(X1,X2) = SSR(X1,X2) – SSR(X1)
  • This extends to any number of predictors; a numeric check appears in the sketch below
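As a concrete check, here is a minimal numpy sketch with simulated data (the variable names and the sse helper are made up for this example) verifying that the two formulas for SSR(X2|X1) agree:

```python
import numpy as np

# Simulated data; all names here are hypothetical, for illustration only
rng = np.random.default_rng(0)
n = 50
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 2 + 1.5 * X1 + 0.8 * X2 + rng.normal(size=n)

def sse(y, *predictors):
    """Error sum of squares from an OLS fit of y on an intercept plus predictors."""
    X = np.column_stack([np.ones_like(y), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

ssto = np.sum((Y - Y.mean()) ** 2)   # SSTO: identical for every model
sse_1 = sse(Y, X1)                   # SSE(X1)
sse_12 = sse(Y, X1, X2)              # SSE(X1, X2)

ssr_2_given_1 = sse_1 - sse_12                        # SSR(X2|X1) via SSE drop
ssr_2_given_1_alt = (ssto - sse_12) - (ssto - sse_1)  # = SSR(X1,X2) - SSR(X1)
assert abs(ssr_2_given_1 - ssr_2_given_1_alt) < 1e-9  # the two forms agree
```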

  3. Definitions and Decomposition of SSR Note that as the number of predictors increases, so does the number of ways of decomposing SSR (the standard decompositions are shown below)
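The slide body itself is an image; the standard two- and three-predictor decompositions from KNNL Chapter 7 are:

```latex
\[
\begin{aligned}
SSR(X_1,X_2)     &= SSR(X_1) + SSR(X_2 \mid X_1) = SSR(X_2) + SSR(X_1 \mid X_2)\\
SSR(X_1,X_2,X_3) &= SSR(X_1) + SSR(X_2 \mid X_1) + SSR(X_3 \mid X_1, X_2)
\end{aligned}
\]
```

There is one such decomposition for each ordering of the predictors, which is why the number of decompositions grows with the number of predictors.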

  4. ANOVA – Sequential Sum of Squares
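A sequential (Type I) ANOVA adds predictors in a stated order and records the drop in SSE at each step. Continuing the sketch above (X3 is a hypothetical third predictor added only for illustration):

```python
X3 = rng.normal(size=n)   # hypothetical third predictor (pure noise here)
preds = [X1, X2, X3]

prev = ssto               # SSE of the intercept-only model
for k in range(len(preds)):
    cur = sse(Y, *preds[:k + 1])
    label = f"SSR(X{k + 1})" if k == 0 else f"SSR(X{k + 1}|X1..X{k})"
    print(label, "=", round(prev - cur, 3))
    prev = cur
# The printed terms sum to SSR(X1,X2,X3) = SSTO - SSE(X1,X2,X3)
```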

  5. Extra Sums of Squares & Tests of Regression Coefficients (Single bk)
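In KNNL's notation (p parameters, so n − p error degrees of freedom), the partial F test for a single coefficient H0: βk = 0 puts the extra sum of squares for Xk in the numerator:

```latex
\[
F^* \;=\; \frac{SSR(X_k \mid X_1,\ldots,X_{k-1},X_{k+1},\ldots,X_{p-1})/1}
               {SSE(X_1,\ldots,X_{p-1})/(n-p)} \;\sim\; F(1,\,n-p)
\]
```

Since F* = (t*)², this is equivalent to the usual t test for βk in the full model.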

  6. Extra Sums of Squares & Tests of Regression Coefficients (Multiple bk)
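For several coefficients at once, say H0: βq = ⋯ = βp−1 = 0, the extra sum of squares for the whole tested block goes in the numerator, divided by the number of tested coefficients:

```latex
\[
F^* \;=\; \frac{SSR(X_q,\ldots,X_{p-1} \mid X_1,\ldots,X_{q-1})/(p-q)}
               {SSE(X_1,\ldots,X_{p-1})/(n-p)} \;\sim\; F(p-q,\,n-p)
\]
```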

  7. Extra Sums of Squares & Tests of Regression Coefficients (General Case)
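The general case compares a full model with a reduced model that obeys H0. A sketch continuing the simulation above, using scipy only for the F distribution (testing the hypothetical H0: β2 = β3 = 0):

```python
from scipy import stats

# H0: beta2 = beta3 = 0 -> reduced model has X1 only; full has X1, X2, X3
sse_r, df_r = sse(Y, X1), n - 2          # df = n - (number of parameters)
sse_f, df_f = sse(Y, X1, X2, X3), n - 4

f_star = ((sse_r - sse_f) / (df_r - df_f)) / (sse_f / df_f)
p_value = stats.f.sf(f_star, df_r - df_f, df_f)   # upper-tail p-value
```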

  8. Other Linear Tests
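Other linear hypotheses are tested the same way: fit a reduced model that imposes the constraint. For instance, under the hypothetical H0: β1 = β2, the terms β1X1 + β2X2 collapse to a common coefficient on X1 + X2:

```python
# Reduced model under H0: beta1 = beta2  ->  Y = b0 + bc*(X1 + X2) + b3*X3 + e
sse_r, df_r = sse(Y, X1 + X2, X3), n - 3
sse_f, df_f = sse(Y, X1, X2, X3), n - 4
f_star = ((sse_r - sse_f) / (df_r - df_f)) / (sse_f / df_f)  # ~ F(1, n-4) under H0
```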

  9. Coefficients of Partial Determination-I
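The coefficient of partial determination measures the proportional reduction in SSE when one predictor is added to a model already containing the other(s); for two predictors, the two orderings give

```latex
\[
R^2_{Y1\mid 2} = \frac{SSR(X_1 \mid X_2)}{SSE(X_2)},
\qquad
R^2_{Y2\mid 1} = \frac{SSR(X_2 \mid X_1)}{SSE(X_1)}
\]
```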

  10. Coefficients of Partial Determination-II
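Continuing the numeric sketch, the sample value of R²_{Y2|1} and the corresponding coefficient of partial correlation (its square root, carrying the sign of b2 in the full fit) are:

```python
r2_y2_given_1 = (sse_1 - sse_12) / sse_1   # R^2_{Y2|1} = SSR(X2|X1) / SSE(X1)
r_y2_given_1 = r2_y2_given_1 ** 0.5        # b2 > 0 by construction in this simulation
```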

  11. Standardized Regression Model - I
  • Useful for reducing round-off error when computing (X'X)^(-1)
  • Makes it easier to compare the magnitudes of effects of predictors measured on different scales
  • Coefficients represent the change in Y (in standard deviation units) as each predictor increases by 1 SD (holding all others constant)
  • Since all variables are centered, there is no intercept term; a sketch of the correlation transformation follows
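A minimal sketch of the correlation transformation, reusing names from the simulation above: each variable is centered and scaled by √(n−1) times its standard deviation, and the standardized model is fit without an intercept.

```python
def corr_transform(v):
    """KNNL correlation transformation: center, then scale so the
    transformed vector has unit sum of squares."""
    return (v - v.mean()) / (np.sqrt(n - 1) * v.std(ddof=1))

Ys, X1s, X2s = corr_transform(Y), corr_transform(X1), corr_transform(X2)
Xs = np.column_stack([X1s, X2s])                  # no intercept column
b_star, *_ = np.linalg.lstsq(Xs, Ys, rcond=None)  # standardized coefficients

# Back-transform to the original scale: b_k = b_k* (s_Y / s_k)
b = b_star * Y.std(ddof=1) / np.array([X1.std(ddof=1), X2.std(ddof=1)])
```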

  12. Standardized Regression Model - II

  13. Standardized Regression Model - III

  14. Multicollinearity
  • Consider a model with 2 predictors (this generalizes to any number of predictors): Yi = β0 + β1Xi1 + β2Xi2 + εi
  • When X1 and X2 are uncorrelated, the regression coefficients b1 and b2 are the same whether we fit simple regressions or a multiple regression, and: SSR(X1) = SSR(X1|X2) and SSR(X2) = SSR(X2|X1)
  • When X1 and X2 are highly correlated, their estimated regression coefficients become unstable and their standard errors become larger (smaller t-statistics, wider confidence intervals), leading to seemingly contradictory inferences when comparing the simple and partial effects of each predictor
  • Estimated means and predicted values are not affected; the simulation below illustrates this
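A small self-contained simulation (all names hypothetical) showing the effect: with nearly collinear predictors, SSR(Z1) is large while SSR(Z1|Z2) is tiny, and the estimated coefficients wander far from the true values even though fitted values remain well determined.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 200
Z1 = rng.normal(size=m)
Z2 = 0.99 * Z1 + 0.05 * rng.normal(size=m)   # nearly collinear with Z1
W = 1 + Z1 + Z2 + rng.normal(size=m)         # true coefficients are (1, 1)

def fit_sse(y, *predictors):
    """OLS coefficients and error sum of squares."""
    X = np.column_stack([np.ones_like(y), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, np.sum((y - X @ beta) ** 2)

_, sse_z1 = fit_sse(W, Z1)
_, sse_z2 = fit_sse(W, Z2)
b_full, sse_z12 = fit_sse(W, Z1, Z2)
ssto = np.sum((W - W.mean()) ** 2)

print("SSR(Z1)    =", round(ssto - sse_z1, 1))    # marginal contribution: large
print("SSR(Z1|Z2) =", round(sse_z2 - sse_z12, 1)) # partial contribution: tiny
print("b1, b2     =", np.round(b_full[1:], 2))    # unstable estimates of (1, 1)
```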
