
Multiple Regression Analysis: Part 2



  1. Multiple Regression Analysis: Part 2 Interpretation and Diagnostics

  2. Learning Objectives • Understand regression coefficients and semi-partial correlations • Learn to use diagnostics to locate problems with data (relative to MRA) • Understand… • Assumptions • Robustness • Methods of dealing with violations • Enhance our interpretation of equations • Understand entry methods

  3. Statistical Tests & Interpretation • Interpretation of regression coefficients • Standardized • Unstandardized • Intercept • Testing regression coefficients • t-statistic & interpretation • Testing R2

  4. Output for MRA Run (coefficients) R2 = .558

  5. Variance in Y Accounted for by two uncorrelated Predictors [Venn diagram: circles for Y, X1, and X2; A = variance in Y accounted for by X1, B = variance in Y accounted for by X2, E (in the Y circle) = error]. R2 = (A+B)/Y. Example #1: Small R2. Example #2: Larger R2.

  6. Variance in Y Accounted for by two correlated Predictors: sr2 and pr2 [Venn diagram, shown for Example #1 (Small R2) and Example #2 (Larger R2): A = variance unique to X1, C = variance unique to X2, B = variance shared by X1 and X2, D = unexplained (error) variance in Y]. sr2 for X1 = A/(A+B+C+D); pr2 for X1 = A/(A+D).
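The sr2 and pr2 areas on the slide can be recovered from the R2 of a full model and a reduced model. A minimal NumPy sketch with simulated data (the variables and coefficients are illustrative, not from the slides): sr2 for X1 is the drop in R2 when X1 is removed, and pr2 rescales that drop by the variance X2 leaves unexplained.

```python
import numpy as np

def r_squared(y, X):
    """R^2 from an OLS fit of y on X (intercept added)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)             # correlated predictors
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

r2_full = r_squared(y, np.column_stack([x1, x2]))   # (A+B+C) / total
r2_x2 = r_squared(y, x2.reshape(-1, 1))             # (B+C) / total

sr2_x1 = r2_full - r2_x2            # unique contribution of X1 (area A)
pr2_x1 = sr2_x1 / (1.0 - r2_x2)     # A / (A+D): X1's share of what X2 leaves
```

Note that pr2 is always at least as large as sr2, since it divides the same unique area A by a smaller denominator.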

  7. Unique Contributions -- breaking sr2 down R2 = .558

  8. A shortcoming to breaking down sr2 R2 = .120

  9. Multicollinearity: One way it can all go bad! Y E A B C X1 X2 D

  10. Methods for diagnosing multicollinearity
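The standard multicollinearity diagnostics are tolerance and the variance inflation factor (VIF), where VIF_j = 1/(1 - R2_j) and R2_j comes from regressing predictor j on all the other predictors. A hedged NumPy sketch (the data are simulated to make x1 and x2 nearly collinear):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n x k):
    regress each predictor on the others and return 1/(1 - R^2)."""
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.delete(X, j, axis=1)
        Xd = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(Xd, X[:, j], rcond=None)
        resid = X[:, j] - Xd @ beta
        tss = ((X[:, j] - X[:, j].mean()) ** 2).sum()
        out.append(1.0 / (resid @ resid / tss))   # 1 / (1 - R^2_j)
    return np.array(out)

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 + 0.1 * rng.normal(size=100)   # nearly collinear with x1
x3 = rng.normal(size=100)              # unrelated predictor
v = vif(np.column_stack([x1, x2, x3]))
```

Here v[0] and v[1] blow up (a common rule of thumb flags VIF > 10) while v[2] stays near 1; tolerance is simply 1/VIF.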

  11. Ways to fix multicollinearity • Discarding Predictors • Combining Predictors • Using Principal Components • Parcelling • Ridge Regression
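Of the fixes listed, ridge regression is the only one that keeps all predictors as-is: it adds a penalty λ to the diagonal of X'X, trading a little bias for much more stable coefficients. A minimal sketch on standardized predictors (simulated, illustrative data; λ here is arbitrary, not a tuned value):

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge coefficients on standardized predictors (centered y);
    lam = 0 recovers ordinary least squares."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    k = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ (y - y.mean()))

rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = x1 + 0.05 * rng.normal(size=100)   # severe collinearity
y = x1 + x2 + rng.normal(size=100)
X = np.column_stack([x1, x2])

b_ols = ridge(X, y, 0.0)     # wild, unstable weights
b_ridge = ridge(X, y, 10.0)  # shrunken, stabilized weights
```

For any λ > 0 the ridge coefficient vector is strictly shorter than the OLS one, which is exactly the stabilizing effect sought when predictors nearly duplicate each other.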

  12. Outliers and Influential Observations: Another way it can all go bad! • Outliers on y • Outliers on x’s • Influential data points

  13. Outliers • Outliers on y • Standardized Residuals • Studentized Residuals (df = N – k – 1) • Deleted Studentized Residuals • Outliers on x’s • Hat elements • Mahalanobis Distance
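The y-side and x-side diagnostics above all fall out of the hat matrix. A NumPy sketch on simulated data (point 10 is given a gross error on y, point 20 is moved far out in x-space); for a model with an intercept, Mahalanobis distance and the hat values are linked exactly by d2 = (n-1)(h - 1/n):

```python
import numpy as np

def outlier_diagnostics(X, y):
    """Hat values (leverage), internally studentized residuals,
    and Mahalanobis distances of the rows of X from their centroid."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])
    H = Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T
    h = np.diag(H)
    e = y - H @ y
    s2 = e @ e / (n - k - 1)                     # df = N - k - 1
    r_stud = e / np.sqrt(s2 * (1.0 - h))         # internally studentized
    Xc = X - X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', Xc, S_inv, Xc)  # squared Mahalanobis
    return h, r_stud, d2

rng = np.random.default_rng(5)
X = rng.normal(size=(50, 2))
y = X @ np.array([1.0, 1.0]) + rng.normal(size=50)
y[10] += 8.0                            # gross outlier on y
X[20] = [6.0, 6.0]                      # far from the centroid: leverage
y[20] = X[20] @ np.array([1.0, 1.0])    # but on the line: leverage only
h, r_stud, d2 = outlier_diagnostics(X, y)
```

Point 10 stands out on the studentized residuals but not on leverage; point 20 does the reverse, which is why both families of diagnostics are needed.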

  14. Outliers on y tcrit(21) = 2.08


  15. Outliers on Xs (Leverage) χ2(crit) for Mahalanobis’ Distance = 7.82

  16. Influential Observations • Cook’s Distance (cutoff ≈ 1.0) • DFFITS [cut-offs of 2 or 2·((k+1)/n)^0.5] • DFBeta • Standardized DFBeta
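Cook's Distance and DFFITS both have closed leave-one-out forms, so no refitting is needed. A sketch on simulated data (point 5 is planted as high-leverage and far off the line, i.e. genuinely influential), using the cutoffs from the slide:

```python
import numpy as np

def influence_measures(X, y):
    """Cook's distance and DFFITS via closed-form leave-one-out algebra."""
    n, k = X.shape
    p = k + 1                                    # parameters incl. intercept
    Xd = np.column_stack([np.ones(n), X])
    H = Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T
    h = np.diag(H)
    e = y - H @ y
    s2 = e @ e / (n - p)
    r_int = e / np.sqrt(s2 * (1.0 - h))          # internally studentized
    cooks = r_int**2 * h / (p * (1.0 - h))
    s2_del = ((n - p) * s2 - e**2 / (1.0 - h)) / (n - p - 1)
    t_ext = e / np.sqrt(s2_del * (1.0 - h))      # deleted studentized
    dffits = t_ext * np.sqrt(h / (1.0 - h))
    return cooks, dffits

rng = np.random.default_rng(6)
X = rng.normal(size=(40, 2))
y = X @ np.array([1.0, -1.0]) + rng.normal(size=40)
X[5] = [5.0, -5.0]
y[5] = 25.0                               # leverage AND a big residual
cooks, dffits = influence_measures(X, y)
cutoff = 2.0 * np.sqrt((2 + 1) / 40)      # the 2*((k+1)/n)^0.5 rule
```

The planted point exceeds both the Cook's D ≈ 1.0 guideline and the DFFITS cutoff, while ordinary points fall well under them.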

  17. Influence (y & leverage)

  18. Once more, with feeling R2 = .687

  19. Plot of Standardized y’ vs. Residual

  20. A cautionary tale: Some more ways it can all go bad! We will use X to predict y1, y2 and y3 in turn.

  21. Exhibit 1, x & y1

  22. Exhibit 2 (x & y2)

  23. Exhibit 3 (x & y3)

  24. Homoscedasticity: Yet another way it can all go bad! • What is homoscedasticity? • Is it better to have heteroscedasticity? • The effects of violation • How to identify it • Strategies for dealing with it
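Beyond eyeballing the residual plot, heteroscedasticity can be screened with a Breusch-Pagan-style test: regress the squared residuals on the predictors and compute n·R2, which is approximately chi-square with k df when the error spread is constant. A sketch on simulated data (one homoscedastic outcome, one whose spread grows with x):

```python
import numpy as np

def bp_stat(X, y):
    """Breusch-Pagan LM statistic: n * R^2 from regressing the squared
    OLS residuals on the predictors; ~ chi^2(k) under homoscedasticity."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    e2 = (y - Xd @ beta) ** 2
    g, *_ = np.linalg.lstsq(Xd, e2, rcond=None)
    fit = Xd @ g
    r2 = 1.0 - ((e2 - fit) ** 2).sum() / ((e2 - e2.mean()) ** 2).sum()
    return n * r2

rng = np.random.default_rng(7)
n = 500
x = rng.uniform(0.1, 3.0, size=(n, 1))
y_hom = 2.0 * x[:, 0] + rng.normal(size=n)             # constant spread
y_het = 2.0 * x[:, 0] + x[:, 0] * rng.normal(size=n)   # fan shape

stat_hom = bp_stat(x, y_hom)   # small: consistent with homoscedasticity
stat_het = bp_stat(x, y_het)   # large: spread clearly depends on x
```

A large statistic flags the fan-shaped pattern seen in the residual plot; remedies then include transforming y or weighted least squares.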

  25. A visual representation of ways that it can all go bad!

  26. Effect Size • Multiple Correlation (R): the correlation between y and the predicted scores y’ • SMC (R2): the squared multiple correlation, the proportion of variance in y accounted for by the predictors (SSregression / SStotal)

  27. Cross Validation • Why • Useful statistics and techniques • Conditions under which likelihood of cross-validation is increased

  28. Assumptions of Regression • Sample Size • Absence of Outliers & Influential Observations • Absence of Multicollinearity and Singularity • Normality • Linearity • Homoscedasticity of Errors • Independence of Errors

  29. Structure Coefficients • What are they? • Vs. pattern coefficients or “weights” • Why we may need both • When they would be used in MRA • Why they are not commonly used • How you get them in SPSS • CD sales example
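A structure coefficient is simply the correlation between a predictor and the predicted scores y’, which works out to r(Xj, y)/R; unlike the weights, it is untouched by multicollinearity. A NumPy sketch on simulated data (illustrative, not the CD sales example from the slides):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
X = rng.normal(size=(n, 2))
y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)

Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)    # pattern coefficients
yhat = Xd @ beta
R = np.corrcoef(y, yhat)[0, 1]                   # multiple correlation

# structure coefficient: correlation of each predictor with y-hat
structure = np.array([np.corrcoef(X[:, j], yhat)[0, 1] for j in range(2)])
```

The identity structure_j = r(Xj, y)/R holds exactly in-sample because the residuals are orthogonal to every predictor, which is also why SPSS users can obtain structure coefficients just by correlating the saved predicted values with the predictors.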

  30. As a reminder, the coefficients (weights)

  31. Structure coefficients R

  32. Model Building in MRA: “Canned” procedures • Enter • Forward • Backward Selection (Deletion) • Stepwise • Hierarchical

  33. Hierarchical – Example Predict employee satisfaction • Block 1: “Hygiene Factor” • Block 2: “Equity” • Block 3: “Organizational Commitment”
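The hierarchical logic above amounts to entering the blocks in order and reading off the R2 change at each step. A sketch with simulated stand-ins for the three blocks (single predictors per block here; the names and effect sizes are illustrative only):

```python
import numpy as np

def r2(y, cols):
    """R^2 of y regressed (with intercept) on the given predictor columns."""
    Xd = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    e = y - Xd @ beta
    return 1.0 - e @ e / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(3)
n = 300
hygiene = rng.normal(size=n)     # hypothetical Block 1 predictor
equity = rng.normal(size=n)      # hypothetical Block 2 predictor
commit = rng.normal(size=n)      # hypothetical Block 3 predictor
satisfaction = (0.4 * hygiene + 0.3 * equity + 0.3 * commit
                + rng.normal(size=n))

results, cols, prev = [], [], 0.0
for name, pred in [("hygiene", hygiene), ("equity", equity),
                   ("commitment", commit)]:
    cols.append(pred)
    cur = r2(satisfaction, cols)
    results.append((name, cur, cur - prev))   # (block, R2, delta-R2)
    prev = cur
```

Each entry of `results` mirrors one line of the SPSS Model Summary: the cumulative R2 after the block enters, and the delta-R2 that block adds over the blocks entered before it.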

  34. Model Summary

  35. Analysis of Variance

  36. Coefficients for Models

  37. Let’s not forget the lesson of structure coefficients…

  38. Interpretation revisited • In light of multicollinearity • Standardized or unstandardized? • Suppressor effects • Missing predictors • Correlated / uncorrelated predictors • Structure coefficients • Reliability of indicators • Mathematical maximization nature of MRA
