
Collinearity


Presentation Transcript


  1. Collinearity: The Problem of Large Correlations Among the Independent Variables

  2. Skill Set • What is collinearity? • Why is it a problem? • How do I know if I’ve got it? • What can I do about it?

  3. Collinearity Defined • Within the set of IVs, one or more IVs are (nearly) totally predicted by the other IVs. • In such a case, the b or beta weights are poorly estimated. • Problem of the “Bouncing Betas.”
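
A quick way to see the "bouncing betas" is by simulation. The sketch below (illustrative data and values, not from the slides) makes X2 nearly a copy of X1 and refits the regression on repeated samples; the individual b weights swing widely even though both true weights are 1.0:

```python
import numpy as np

# Sketch: two IVs that are almost interchangeable (r ~ .99) predicting y.
# Across repeated samples the individual b weights "bounce," even though
# the overall fit stays stable.
rng = np.random.default_rng(0)
n = 50

for sample in range(5):
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.15, size=n)      # x2 is nearly a copy of x1
    y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)  # both true weights are 1.0
    X = np.column_stack([np.ones(n), x1, x2])     # intercept + IVs
    b = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS estimates
    print(f"sample {sample}: b1 = {b[1]: .2f}, b2 = {b[2]: .2f}")
```

The printed weights change markedly from sample to sample, while their sum stays near 2.0; prediction is fine, but the individual b weights are not trustworthy.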

  4. Diagnostics 1: Variance Inflation Factor (VIF) • Standard error of the b weight with 2 IVs: $SE(b_1) = \sqrt{\frac{s_{y.12}^2}{\sum x_1^2} \cdot \frac{1}{1 - r_{12}^2}}$, where $s_{y.12}^2$ is the residual variance and $\sum x_1^2$ is the sum of squared deviations of $X_1$. • The quantity under the square root is the sampling variance of the b weight; the factor $\frac{1}{1 - r_{12}^2}$ is the VIF.

  5. VIF (2) • Standard error with k predictors: $SE(b_j) = \sqrt{\frac{s_{y.12 \ldots k}^2}{\sum x_j^2} \cdot \frac{1}{1 - R_j^2}}$, where $R_j^2$ is the squared multiple correlation from regressing $X_j$ on the other IVs, so $VIF_j = \frac{1}{1 - R_j^2}$. • Large values of VIF are trouble. Some say values > 10 are high.
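
The VIF can be computed directly from its definition: regress each IV on the remaining IVs and take $1/(1 - R_j^2)$. A minimal sketch (the helper name vif and the example data are assumptions for illustration):

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j of X on the remaining columns (plus an intercept)."""
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        b = np.linalg.lstsq(Z, y, rcond=None)[0]
        r2 = 1 - (y - Z @ b).var() / y.var()       # R_j^2
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Example: x3 is nearly a linear combination of x1 and x2.
rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=(2, 100))
x3 = x1 + x2 + rng.normal(scale=0.1, size=100)
print(vif(np.column_stack([x1, x2, x3])))          # all three VIFs are large
```

All three VIFs come out well above the "> 10" rule of thumb, even though no single pairwise correlation is extreme, which is why VIF beats eyeballing the correlation matrix.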

  6. Tolerance • Tolerance is $1 - R_j^2 = 1/VIF_j$. • Small values are trouble. Maybe .10?
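
Tolerance is just the reciprocal of VIF, so a VIF of 10 corresponds to a tolerance of .10. If statsmodels is available, its built-in variance_inflation_factor gives the same diagnostic (a sketch with illustrative data; column 0 is the constant and is skipped in the report):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Illustrative data: x3 is nearly a linear combination of x1 and x2.
rng = np.random.default_rng(2)
x1, x2 = rng.normal(size=(2, 200))
x3 = x1 + x2 + rng.normal(scale=0.1, size=200)
X = sm.add_constant(np.column_stack([x1, x2, x3]))   # constant in column 0

for j in range(1, X.shape[1]):                       # skip the constant
    v = variance_inflation_factor(X, j)
    print(f"x{j}: VIF = {v:8.1f}   tolerance = {1.0 / v:.3f}")
```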

  7. Condition Index

     Number   Eigenvalue   Condition Index   Variance Proportions
                                             Constant   X1     X2     X3
       1        3.771          1.00            .004     .006   .006   .008
       2         .106          5.969           .003     .029   .268   .774
       3         .079          6.90            .000     .749   .397   .066
       4         .039          9.946           .993     .215   .329   .152

     Lambda is an eigenvalue. Number refers to a linear combination of the predictors; eigenvalue refers to the variance of that combination. Collinearity is spotted by finding 2 or more variables that have large proportions of variance (.50 or more) that correspond to the same large condition index. A rule of thumb is to label as large those condition indices in the range of 30 or larger. No apparent problem here.
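
A table like the one above can be reproduced from the design matrix itself. The sketch below follows the usual Belsley-style computation: scale the columns of X (constant included) to unit length, take the singular values, and form condition indices and variance-decomposition proportions. The function name is an assumption, and the unit-length scaling is what lets the constant show up with its own proportions, as in the table:

```python
import numpy as np

def collinearity_diagnostics(X):
    """Condition indices and variance-decomposition proportions for a
    design matrix X whose first column is the constant.
    Columns are scaled to unit length before the decomposition."""
    Xs = X / np.sqrt((X ** 2).sum(axis=0))           # unit-length columns
    _, d, Vt = np.linalg.svd(Xs, full_matrices=False)
    eigenvalues = d ** 2                             # variances of the combinations
    cond_index = d.max() / d                         # 1.00 up to the largest index
    phi = (Vt.T ** 2) / d ** 2                       # phi[k, j]: coef k, dimension j
    proportions = (phi / phi.sum(axis=1, keepdims=True)).T
    return eigenvalues, cond_index, proportions      # rows of proportions = "Number"
```

Each row of the proportions matrix corresponds to a "Number" row of the table; two or more entries of .50 or more in the row with the largest condition index point to the predictors whose b weights share the problem.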

  8. Condition Index (2)

     Number   Eigenvalue   Condition Index   Variance Proportions
                                             Constant   X1     X2     X3
       1        3.819          1.00            .004     .006   .002   .002
       2         .117          5.707           .043     .384   .041   .087
       3         .047          9.025           .876     .608   .001   .042
       4         .017         15.128           .077     .002   .967   .868

     The last condition index (15.128) is highly associated with X2 and X3. The b weights for X2 and X3 are probably not well estimated.

  9. Dealing with Collinearity • Lump it. Admit the ambiguity; report the standard errors of the b weights and refer also to the correlations. • Select or combine variables. • Factor analyze the set of IVs. • Use another type of analysis (e.g., path analysis). • Use another type of regression (ridge regression; sketched below). • Unit weights (no longer regression).
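
Of the remedies listed, ridge regression lends itself to a quick sketch: it adds a constant to the diagonal of X'X, which shrinks the b weights and keeps them from bouncing, at the cost of some bias. The function name and the penalty value below are arbitrary illustrations, not a prescription:

```python
import numpy as np

def ridge(X, y, penalty=1.0):
    """Ridge weights b = (Xz'Xz + penalty*I)^(-1) Xz'y, computed on
    standardized IVs so one penalty applies to all predictors alike."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize the IVs
    yc = y - y.mean()                                # center y; intercept not penalized
    k = Xz.shape[1]
    return np.linalg.solve(Xz.T @ Xz + penalty * np.eye(k), Xz.T @ yc)
```

Larger penalties shrink the weights more; choosing the penalty (e.g., by cross-validation) is the price of the added stability.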

  10. Review • What is collinearity? • Why is collinearity a problem? • What is the VIF? • What is Tolerance? • What is a condition index? • What are some things you can do to deal with collinearity?
