
Review of Univariate Linear Regression




Presentation Transcript


  1. Review of Univariate Linear Regression BMTRY 726 3/4/14

  2. Regression Analysis We are interested in predicting values of one or more responses from a set of predictors. Regression analysis is an extension of what we discussed with ANOVA and MANOVA. We now allow for inclusion of continuous predictors in place of (or in addition to) treatment indicators in MANOVA.

  3. Univariate Regression Analysis A univariate regression model states that the response Y is composed of a mean that depends on a set of independent predictors zi, plus random error ei. Model assumptions:
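In standard notation, a sketch of the model with r predictors and its usual assumptions (j indexes the n observations):

$$Y_j = \beta_0 + \beta_1 z_{j1} + \cdots + \beta_r z_{jr} + \varepsilon_j, \qquad j = 1, \dots, n,$$

with $E(\varepsilon_j) = 0$, $\mathrm{Var}(\varepsilon_j) = \sigma^2$, and $\mathrm{Cov}(\varepsilon_j, \varepsilon_k) = 0$ for $j \neq k$.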

  4. Least Squares Estimation We use the method of least squares to estimate our model parameters.
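In matrix form, with Y the n × 1 response vector and Z the design matrix (a column of 1s followed by the predictors), least squares minimizes the residual sum of squares; a sketch of the familiar closed form:

$$\hat{\beta} = \arg\min_{\beta}\,(Y - Z\beta)'(Y - Z\beta) = (Z'Z)^{-1}Z'Y,$$

assuming Z has full column rank (the condition revisited under collinearity below).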

  5. Least Squares Estimation We estimate the variance using the residuals
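With residuals $\hat{\varepsilon} = Y - Z\hat{\beta}$, the usual unbiased estimator (r predictors plus an intercept) is:

$$s^2 = \frac{\hat{\varepsilon}'\hat{\varepsilon}}{n - r - 1}.$$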

  6. Least Squares Estimation

  7. Geometry of Least Squares

  8. LRT for individual bi’s First we may test whether any predictors affect the response. The LRT is based on the difference in sums of squares between the full and null models…

  9. LRT for individual bi’s Difference in SS between the full and null models…
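As a sketch, for the null model (intercept only) versus the full model with r predictors, the statistic built from this difference in sums of squares is:

$$F = \frac{\left(SS_{res}(\text{null}) - SS_{res}(\text{full})\right)/r}{SS_{res}(\text{full})/(n - r - 1)},$$

which follows an F distribution with r and n − r − 1 degrees of freedom under the null; the same construction with q dropped predictors gives the subset test used in the example below.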

  10. Model Building If we have a large number of predictors, we want to identify the “best” subset. There are many methods of selecting the “best”: -Examine all possible subsets of predictors -Forward stepwise selection -Backward stepwise selection

  11. Model Building Though we can keep the predictors that are significant, this may not yield the “best” subset (several models may yield similar results). The “best” choice is made by examining some criterion: -R2 -adjusted R2 -Mallows’ Cp -AIC Since R2 never decreases as predictors are added, Mallows’ Cp and AIC are better choices for selecting the “best” predictor subset.

  12. Model Building Mallows’ Cp: -Plot the pairs (p, Cp) -Good models have coordinates near the 45° line Akaike’s Information Criterion: -Smaller is better
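A sketch of both criteria in one common parameterization, where p is the number of parameters in the candidate model and s2 comes from the full model:

$$C_p = \frac{SS_{res}(p)}{s^2} - (n - 2p), \qquad AIC = n \ln\!\left(\frac{SS_{res}(p)}{n}\right) + 2p.$$

A model with little bias has $C_p \approx p$, which is why good models plot near the 45° line.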

  13. Model Checking It is always good to check whether the model is “correct” before using it to make decisions… Information about fit is contained in the residuals. If the model fits well, the estimated error terms should mimic N(0, σ2). So how can we check?

  14. Model Checking • Studentized residuals plot • Plot residuals versus predicted values -Ideally points should be scattered (i.e. no pattern) -If a pattern exists, its shape can indicate the nature of the problem

  15. Model Checking • Plot residuals versus predictors • QQ plot of studentized residuals
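A minimal sketch of these plots in R, assuming a fitted lm object named mod (the object name is illustrative):

r.stud <- rstudent(mod)  # studentized residuals

# Residuals vs fitted values: look for a patternless band around 0
plot(fitted(mod), r.stud, xlab = "Fitted values", ylab = "Studentized residuals")
abline(h = 0, lty = 2)

# QQ plot of studentized residuals: points near the line support normality
qqnorm(r.stud)
qqline(r.stud)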

  16. Model Checking While residual analysis is useful, it may miss outliers, i.e. observations that are very influential on predictions. Leverage: -How far is the jth observation from the others? -How much pull does observation j exert on the fit? Observations that affect inferences are influential.
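As a sketch, leverage is usually quantified by the diagonal of the hat matrix $H = Z(Z'Z)^{-1}Z'$:

$$h_{jj} = z_j'(Z'Z)^{-1}z_j,$$

where $z_j$ is the jth row of Z; a large $h_{jj}$ means observation j sits far from the others in predictor space and exerts more pull on its own fitted value.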

  17. Collinearity If Z is not full rank, some linear combination Za of the columns of Z equals 0. In such a case the columns are collinear, and the inverse of Z’Z does not exist. It is rare that Za is exactly 0, but if a combination exists that is nearly zero, (Z’Z)-1 is numerically unstable. This results in very large estimated variances of the model parameters, making it difficult to identify significant regression coefficients.

  18. Collinearity We can check for severity of multicollinearity using the variance inflation factor (VIF)
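For the jth predictor, with $R_j^2$ the R2 from regressing zj on the remaining predictors:

$$VIF_j = \frac{1}{1 - R_j^2}.$$

A common rule of thumb treats VIF values above about 10 as evidence of serious multicollinearity.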

  19. Misspecified Model If important predictors are omitted, the vector of regression coefficients may be biased. The estimates are biased unless the columns of Z1 (included) and Z2 (omitted) are orthogonal.
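As a sketch: if the true model is $Y = Z_1\beta_1 + Z_2\beta_2 + \varepsilon$ but only Z1 is fit, then

$$E(\hat{\beta}_1) = \beta_1 + (Z_1'Z_1)^{-1}Z_1'Z_2\,\beta_2,$$

so the bias term vanishes exactly when $Z_1'Z_2 = 0$.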

  20. Example Develop a model to predict percent body fat (PBF) using: -Age, Weight, Height, Chest, Abdomen, Hip, Arm, Wrist Our full model is:

lm(formula = PBF ~ Age + Wt + Ht + Chest + Abd + Hip + Arm + Wrist, data = SSbodyfat)

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -16.67443   15.64951  -1.065 0.287711
Age           0.03962    0.02983   1.328 0.185322
Wt           -0.07582    0.04704  -1.612 0.108274
Ht           -0.10940    0.09398  -1.164 0.245502
Chest        -0.06304    0.09758  -0.646 0.518838
Abd           0.94626    0.08556  11.059  < 2e-16 ***
Hip          -0.07802    0.13584  -0.574 0.566263
Arm           0.50335    0.18791   2.679 0.007896 **
Wrist        -1.78671    0.50204  -3.559 0.000448 ***

Residual standard error: 4.347 on 243 degrees of freedom
Multiple R-squared: 0.7388, Adjusted R-squared: 0.7303
F-statistic: 85.94 on 8 and 243 DF, p-value: < 2.2e-16

  21. LRT What if we want to test whether the four least significant predictors in the model (Age, Ht, Chest, and Hip) can be removed?
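A sketch of this test in R via a partial F test on nested fits (the object names mod8 and mod4 are illustrative; mod4 matches the reduced model summarized below):

mod8 <- lm(PBF ~ Age + Wt + Ht + Chest + Abd + Hip + Arm + Wrist, data = SSbodyfat)
mod4 <- lm(PBF ~ Wt + Abd + Arm + Wrist, data = SSbodyfat)  # drop Age, Ht, Chest, Hip
anova(mod4, mod8)  # F test on the difference in residual sums of squares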

  22. Model Subset Selection First consider the plot of SSres for all possible subsets of the eight predictors

  23. Model Subset Selection What about Mallows’ Cp and AIC?
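A sketch of how these criteria might be computed in R; the leaps package’s regsubsets() is one common tool for the all-subsets search (the package choice and object names are assumptions, not from the slides):

library(leaps)
allsub <- regsubsets(PBF ~ Age + Wt + Ht + Chest + Abd + Hip + Arm + Wrist,
                     data = SSbodyfat, nvmax = 8)
summ <- summary(allsub)
summ$cp                    # Mallows' Cp for the best model of each size
plot(allsub, scale = "Cp") # visual comparison across subsets
AIC(mod4, mod8)            # AIC for the two candidate fits from the LRT sketch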

  24. Model Say we choose the model with the four predictors.

> summary(mod4)

Call:
lm(formula = PBF ~ Wt + Abd + Arm + Wrist, data = SSbodyfat, x = T)

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -34.85407    7.24500  -4.811 2.62e-06 ***
Wt           -0.13563    0.02475  -5.480 1.05e-07 ***
Abd           0.99575    0.05607  17.760  < 2e-16 ***
Arm           0.47293    0.18166   2.603 0.009790 **
Wrist        -1.50556    0.44267  -3.401 0.000783 ***

Residual standard error: 4.343 on 247 degrees of freedom
Multiple R-squared: 0.735, Adjusted R-squared: 0.7307
F-statistic: 171.3 on 4 and 247 DF, p-value: < 2.2e-16

  25. Model Check Say we choose the model with four predictors. We need to check our regression diagnostics.
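A minimal sketch: the base R plot method for lm objects produces the standard diagnostic panels in one call.

par(mfrow = c(2, 2))
plot(mod4)  # residuals vs fitted, QQ plot, scale-location, residuals vs leverage
par(mfrow = c(1, 1))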

  26. Model Check What about our predictors versus the residuals?

  27. Model Check What about influential points?

> SSbodyfat[c(39,175,159,206,36),]
    Obs  PBF     Wt   Abd  Arm Wrist
     39 35.2 363.15 148.1 29.0  21.4
    175 25.3 226.75 108.8 21.0  20.1
    159 12.5 136.50  76.6 34.9  16.9
    206 16.6 208.75  96.3 23.1  19.4
     36 40.1 191.75 113.1 29.8  17.0
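A sketch of how such points might be flagged in R (the cutoffs are common rules of thumb, not from the slides):

h <- hatvalues(mod4)       # leverage
d <- cooks.distance(mod4)  # overall influence on the fit
which(h > 2 * length(coef(mod4)) / nrow(SSbodyfat))  # high-leverage observations
which(d > 4 / nrow(SSbodyfat))                       # influential by Cook's distance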

  28. Model Check What about collinearity?

> round(cor(SSbodyfat[,c(2,3,6,8,9)]), digits=3)
         Wt   Abd   Arm Wrist
Wt    1.000 0.888 0.630 0.730
Abd   0.888 1.000 0.503 0.620
Arm   0.630 0.503 1.000 0.586
Wrist 0.730 0.620 0.586 1.000

> library(HH)
> vif(mod4)
      Wt      Abd      Arm    Wrist
7.040774 4.864380 1.793374 2.273047

  29. What do all our model checks tell us about the validity of our model?

  30. What if our investigator felt that all 13 predictors would give the best model?

> summary(mod13)

Call:
lm(formula = PBF ~ Age + Wt + Ht + Neck + Chest + Abd + Hip + Thigh + Knee + Ankle + Bicep + Arm + Wrist, data = bodyfat, x = T)

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -18.18849   17.34857  -1.048  0.29551
Age           0.06208    0.03235   1.919  0.05618 .
Wt           -0.08844    0.05353  -1.652  0.09978 .
Ht           -0.06959    0.09601  -0.725  0.46925
Neck         -0.47060    0.23247  -2.024  0.04405 *
Chest        -0.02386    0.09915  -0.241  0.81000
Abd           0.95477    0.08645  11.044  < 2e-16 ***
Hip          -0.20754    0.14591  -1.422  0.15622
Thigh         0.23610    0.14436   1.636  0.10326
Knee          0.01528    0.24198   0.063  0.94970
Ankle         0.17400    0.22147   0.786  0.43285
Bicep         0.18160    0.17113   1.061  0.28966
Arm           0.45202    0.19913   2.270  0.02410 *
Wrist        -1.62064    0.53495  -3.030  0.00272 **

Residual standard error: 4.305 on 238 degrees of freedom
Multiple R-squared: 0.749, Adjusted R-squared: 0.7353
F-statistic: 54.65 on 13 and 238 DF, p-value: < 2.2e-16

  31. Is collinearity problematic?

> vif(mod13)
      Age        Wt        Ht      Neck     Chest       Abd       Hip
 2.250450 33.509320  1.674591  4.324463  9.460877 11.767073 14.796520
    Thigh      Knee     Ankle     Bicep       Arm     Wrist
 7.777865  4.612147  1.907961  3.619744  2.192492  3.377515
