
Multiple Regression

STAT 101, Dr. Kari Lock Morgan. Multiple Regression. Sections 9.2, 10.1, 10.2: multiple explanatory variables (10.1); partitioning variability – R2, ANOVA (9.2); conditions – residual plot (10.2).


Presentation Transcript


  1. STAT 101 Dr. Kari Lock Morgan Multiple Regression • SECTIONS 9.2, 10.1, 10.2 • Multiple explanatory variables (10.1) • Partitioning variability – R2 , ANOVA (9.2) • Conditions – residual plot (10.2)

  2. Exam 2 Grades: In-Class

  3. Exam 2 Re-grades • Re-grade requests due in writing by class on Monday, 4/15/14 • Partial credit will not be altered – only submit a re-grade request if you think you have entirely the correct answer but got points off • Grades may go up or down • If points were added up incorrectly, just bring your exam to your TA (no need for an official re-grade)

  4. More than 2 variables! • Today we’ll finally learn a way to handle more than 2 variables!

  5. Multiple Regression • Multiple regression extends simple linear regression to include multiple explanatory variables: ŷ = b0 + b1x1 + b2x2 + … + bkxk
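A minimal sketch of fitting such a model by least squares; all the grade numbers below are hypothetical, made up for illustration:

```python
import numpy as np

# Hypothetical data: predict final exam score from exam 1 and exam 2
# (made-up numbers for illustration only)
exam1 = np.array([70., 85., 90., 60., 75., 95., 80., 65.])
exam2 = np.array([72., 88., 85., 58., 70., 98., 82., 60.])
final = np.array([68., 90., 88., 55., 74., 97., 85., 62.])

# Design matrix with an intercept column; the fitted model is
# predicted final = b0 + b1*exam1 + b2*exam2
X = np.column_stack([np.ones(len(final)), exam1, exam2])
coefs, *_ = np.linalg.lstsq(X, final, rcond=None)
predicted = X @ coefs
```

In practice you would get the same coefficients (plus standard errors and p-values) from R's `lm` or a statistics package; the point here is only that the fit minimizes the sum of squared residuals.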

  6. Grade on Final • We’ll use your current grades to predict final exam scores, based on a model from previous 101 students • Response: final exam score • Explanatory: hw average, clicker average, exam 1, exam 2

  7. Grade on Final What variable is the most significant predictor of final exam score? Homework average Clicker average Exam 1 Exam 2

  8. Inference for Coefficients The p-value for explanatory variable xi is associated with the hypotheses H0: βi = 0 vs. Ha: βi ≠ 0. For intervals and p-values of coefficients in multiple regression, use a t-distribution with degrees of freedom n – k – 1, where k is the number of explanatory variables included in the model
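Given a coefficient's t-statistic, the two-sided p-value uses n – k – 1 degrees of freedom. A sketch with hypothetical numbers (scipy assumed available):

```python
from scipy import stats

n = 50         # hypothetical sample size
k = 4          # hypothetical number of explanatory variables
t_stat = 2.31  # hypothetical t-statistic for one coefficient
df = n - k - 1                              # degrees of freedom: n - k - 1
p_value = 2 * stats.t.sf(abs(t_stat), df)   # two-sided p-value
```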

  9. Grade on Final Estimate your score on the final exam. What type of interval do you want for this estimate? Confidence interval Prediction interval

  10. Grade on Final Estimate your score on the final exam. (for this data hw average was out of 10, clicker average was out of 2)

  11. Grade on Final Is the clicker coefficient really negative?!?

  12. Grade on Final Is your score on exam 2 really not a significant predictor of your final exam score?!?

  13. Coefficients • The coefficient (and significance) for each explanatory variable depends on the other variables in the model!

  14. Grade on Final If you take Exam 1 out of the model… Now Exam 2 is significant! Model with Exam 1:

  15. Multiple Regression • The coefficient for each explanatory variable is the predicted change in y for one unit change in x, given the other explanatory variables in the model! • The p-value for each coefficient indicates whether it is a significant predictor of y, given the other explanatory variables in the model! • If explanatory variables are associated with each other, coefficients and p-values will change depending on what else is included in the model

  16. Grade on Final If you include Project 1 in the model… Model without Project 1:

  17. Grades

  18. Evaluating a Model • How do we evaluate the success of a model? • How do we determine the overall significance of a model? • How do we choose between two competing models?

  19. Variability • One way to evaluate a model is to partition variability • A good model “explains” a lot of the variability in Y: Total Variability = Variability Explained by the Model + Error Variability

  20. Exam Scores • Without knowing the explanatory variables, we can say that a person’s final exam score will probably be between 60 and 98 (the range of Y) • Knowing hw average, clicker average, exam 1 and 2 grades, and project 1 grades, we can give a narrower prediction interval for final exam score • We say that some of the variability in y is explained by the explanatory variables • How do we quantify this?

  21. Variability • How do we quantify variability in Y? • Standard deviation of Y • Sum of squared deviations from the mean of Y • (a) or (b) • None of the above

  22. Sums of Squares Total variability (SST) = variability explained by the model (SSM) + error variability (SSE): SST = SSM + SSE

  23. Variability • If SSM is much higher than SSE, then the model explains a lot of the variability in Y

  24. R2 • R2 = SSM / SST = (Variability Explained by the Model) / (Total Variability) • R2 is the proportion of the variability in Y that is explained by the model

  25. R2 • For simple linear regression, R2 is just the squared correlation between X and Y • For multiple regression, R2 is the squared correlation between the actual values and the predicted values
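Both views of R2 can be checked numerically. A sketch with hypothetical data, fitting by least squares and computing R2 from the sums of squares and as a squared correlation:

```python
import numpy as np

# Hypothetical data: fit final ~ exam1 + exam2 by least squares
exam1 = np.array([70., 85., 90., 60., 75., 95., 80., 65.])
exam2 = np.array([72., 88., 85., 58., 70., 98., 82., 60.])
final = np.array([68., 90., 88., 55., 74., 97., 85., 62.])
X = np.column_stack([np.ones(len(final)), exam1, exam2])
coefs, *_ = np.linalg.lstsq(X, final, rcond=None)
y_hat = X @ coefs

sst = np.sum((final - final.mean()) ** 2)  # SST: total variability
sse = np.sum((final - y_hat) ** 2)         # SSE: error variability
ssm = sst - sse                            # SSM: variability explained
r2 = ssm / sst

# For a least-squares fit, R^2 equals the squared correlation
# between the actual and predicted values:
r2_check = np.corrcoef(final, y_hat)[0, 1] ** 2
```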

  26. R2

  27. Final Exam Grade

  28. Is the model significant? • If we want to test whether the model is significant (whether the model helps to predict y), we can test the hypotheses: H0: β1 = β2 = … = βk = 0 vs. Ha: at least one βi ≠ 0 • We do this with ANOVA!

  29. ANOVA for Regression
Source   df          Sum of Sq.   Mean Square               F
Model    k           SSM          MSM = SSM / k             F = MSM / MSE
Error    n – k – 1   SSE          MSE = SSE / (n – k – 1)
Total    n – 1       SST
k: number of explanatory variables, n: sample size
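The F-statistic and its p-value follow directly from the table entries. A sketch with hypothetical sums of squares (scipy assumed available):

```python
from scipy import stats

# Hypothetical sums of squares from a regression with k = 4 predictors, n = 50
ssm, sse = 1800.0, 1200.0
k, n = 4, 50

msm = ssm / k              # mean square for the model
mse = sse / (n - k - 1)    # mean square error
f_stat = msm / mse         # F = MSM / MSE
p_value = stats.f.sf(f_stat, k, n - k - 1)   # upper tail of F(k, n-k-1)
```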

  30. ANOVA for Regression

  31. Final Exam Grade

  32. Simple Linear Regression • For simple linear regression, the following tests will all give equivalent p-values: • t-test for non-zero correlation • t-test for non-zero slope • ANOVA for regression

  33. Mean Square Error (MSE) • Mean square error (MSE) measures the average variability in the errors (residuals) • The square root of MSE gives the standard deviation of the residuals (giving a typical distance of points from the line) • This number is also given in the R output as the residual standard error, and is known as sε in the textbook
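A sketch of computing MSE and the residual standard error from hypothetical residuals:

```python
import numpy as np

# Hypothetical residuals from a fitted model with k = 2 predictors
residuals = np.array([2.0, -1.5, 0.5, -2.0, 1.0, -0.5, 1.5, -1.0])
n, k = len(residuals), 2

mse = np.sum(residuals ** 2) / (n - k - 1)  # mean square error: SSE / (n-k-1)
resid_se = np.sqrt(mse)                     # residual standard error
```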

  34. Final Exam Grade

  35. Simple Linear Model Residual standard error = √MSE = sε estimates the standard deviation of the residuals (the spread of the normal distributions around the predicted values)

  36. Residual Standard Error • Use the fact that the residual standard error is 5.494 and your predicted final exam score to compute an approximate 95% prediction interval for your final exam score • NOTE: This calculation only takes into account errors around the line, not uncertainty in the line itself, so your true prediction interval will be slightly wider
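The approximate interval is just predicted value ± 2 residual standard errors. Using the slide's 5.494 and a hypothetical predicted score:

```python
# Approximate 95% prediction interval: predicted value +/- 2 * residual SE
resid_se = 5.494   # residual standard error from the model output
y_hat = 85.0       # hypothetical predicted final exam score

lower = y_hat - 2 * resid_se
upper = y_hat + 2 * resid_se
# As noted on the slide, the true prediction interval is slightly wider,
# since this ignores uncertainty in the fitted line itself.
```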

  37. Revisiting Conditions • For simple linear regression, we learned that the following should hold for inferences to be valid: • Linearity • Constant variability of the residuals • Normality of the residuals • How do we assess the first two conditions in multiple regression, when we can no longer visualize with a scatterplot?

  38. Residual Plot • A residual plot is a scatterplot of the residuals against the predicted responses • Should have: • No obvious pattern • Constant variability
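A sketch of drawing such a plot with matplotlib, using simulated well-behaved residuals (no pattern, constant spread); all values are made up:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs without a display
import matplotlib.pyplot as plt

# Hypothetical predicted values and residuals from a fitted model
rng = np.random.default_rng(0)
predicted = np.linspace(60, 95, 40)
residuals = rng.normal(0, 5, size=40)  # centered at 0, constant spread

plt.scatter(predicted, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("Predicted final exam score")
plt.ylabel("Residual")
plt.savefig("residual_plot.png")
```

A fan shape (spread growing with the predicted value) or a curve in this plot would signal that the conditions below are violated.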

  39. Residual Plots Obvious pattern Variability not constant

  40. Final Exam Score Are the conditions satisfied? (a) Yes (b) No

  41. Conditions • What if the conditions for inference aren’t met??? • Option 1 (best option): Take STAT 210 and learn more about modeling! • Option 2: Try a transformation…

  42. Transformations • If the conditions are not satisfied, there are some common transformations you can apply to the response variable • You can take any function of y and use it as the response, but the most common are • log(y) (natural logarithm, ln) • √y (square root) • y2 (squared) • ey (exponential)
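Applying a transformation is just a matter of recomputing the response before refitting. A sketch with a hypothetical right-skewed response:

```python
import numpy as np

# Hypothetical right-skewed response; a log transform often evens out
# increasing spread in the residuals
y = np.array([12.0, 15.0, 20.0, 35.0, 60.0, 110.0, 250.0, 400.0])

log_y = np.log(y)    # natural log, as on the slide
sqrt_y = np.sqrt(y)  # square-root transform
# Refit the model using log_y (or sqrt_y) as the response instead of y;
# predictions are then on the transformed scale.
```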

  43. log(y) Original Response, y: Logged Response, log(y):

  44. √y Original Response, y: Square root of response, √y:

  45. y2 Original Response, y: Squared response, y2:

  46. ey Original Response, y: Exponentiated response, ey:

  47. Transformations • Interpretation becomes a bit more complicated if you transform the response – it should only be done if it clearly helps the conditions to be met • If you transform the response, be careful when interpreting coefficients and predictions • The slope will now have different meaning, and predictions and confidence/prediction intervals will be for the transformed response

  48. Transformations • You do NOT need to know which transformation would be appropriate for given data on the final, but they may help if conditions are not met for Project 2 or for future data you may want to analyze

  49. To Come… • How do we decide which explanatory variables to include in the model? • How do we use categorical explanatory variables? • What if the coefficient of one explanatory variable depends on the value of another explanatory variable?

  50. Project 2 • Project done in your lab groups – one project per group • 10 page (max) paper: due Wednesday, 4/23 • Choose one quantitative variable and answer questions about it and its relationship with other variables • Use multiple regression and anything else we’ve learned in the course • Project 2 Details here
