
Multiple and complex regression


Presentation Transcript


  1. Multiple and complex regression

  2. Extensions of simple linear regression • Multiple regression models: predictor variables are continuous • Analysis of variance: predictor variables are categorical (grouping variables) • But general linear models can include both continuous and categorical predictors (a minimal sketch follows)
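
A minimal sketch in Python of a general linear model mixing one continuous and one categorical predictor, using the statsmodels formula interface; the data frame and variable names (y, x, group) are hypothetical:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: x is continuous, group is categorical.
df = pd.DataFrame({
    "y": [2.3, 3.1, 4.0, 5.2, 6.1, 7.3],
    "x": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    "group": ["a", "a", "a", "b", "b", "b"],
})

# C() marks group as categorical, so the fit combines a regression
# slope for x with ANOVA-style effects for group.
model = smf.ols("y ~ x + C(group)", data=df).fit()
print(model.summary())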

  3. Relative abundance of C3 and C4 plants • Paruelo & Lauenroth (1996) examined the geographic distribution of, and the effects of climate variables on, the relative abundance of a number of plant functional types (PFTs): shrubs, forbs, succulents, C3 grasses, and C4 grasses.

  4. Data: 73 sites across temperate central North America. Response variable: relative abundance of PFTs (based on cover, biomass, and primary production) at each site. Predictor variables: longitude, latitude, mean annual temperature, mean annual precipitation, winter (%) precipitation, summer (%) precipitation, and biome (grassland or shrubland).

  5. Box 6.1: Relative abundances were transformed as ln(abundance + 1) because the raw values were positively skewed (a small sketch follows).
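
The ln(x + 1) transformation can be sketched with numpy's log1p, which computes ln(1 + x) and is accurate for values near zero; the abundances below are hypothetical:

import numpy as np

# Hypothetical, positively skewed relative abundances.
abundance = np.array([0.0, 0.01, 0.02, 0.05, 0.10, 0.40, 0.80])

# ln(x + 1): pulls in the long right tail while keeping zeros at zero.
transformed = np.log1p(abundance)
print(transformed)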

  6. Comparing log10 vs ln transformations: the two differ only by the constant factor ln 10 ≈ 2.303, so they reshape the distribution identically.
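
A one-line check of that equivalence (hypothetical values):

import numpy as np

x = np.array([0.0, 0.05, 0.4, 0.8])
# log10(x + 1) equals ln(x + 1) / ln(10): same shape, rescaled slope.
print(np.allclose(np.log10(x + 1), np.log1p(x) / np.log(10)))  # True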

  7. Collinearity • Causes computational problems because it drives the determinant of the matrix of X-variables (X′X) toward zero, and matrix inversion essentially involves dividing by that determinant, which is very sensitive to small differences in the numbers (illustrated below) • Inflates the standard errors of the estimated regression slopes
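
A quick numpy illustration with made-up data: as two predictors become nearly collinear, the determinant of X′X collapses toward zero.

import numpy as np

rng = np.random.default_rng(0)
n = 73
x1 = rng.normal(size=n)
x2_independent = rng.normal(size=n)
x2_collinear = x1 + rng.normal(scale=0.01, size=n)  # almost a copy of x1

for x2 in (x2_independent, x2_collinear):
    X = np.column_stack([np.ones(n), x1, x2])
    # Near-singular X'X: the determinant is orders of magnitude smaller.
    print(np.linalg.det(X.T @ X))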

  8. Detecting collinearity • Check tolerance values • Plot the variables against each other • Examine a matrix of correlation coefficients between predictor variables (a sketch of these checks follows)
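
A minimal sketch of the correlation-matrix and tolerance checks in Python; the predictors (lat, long, temp) are simulated stand-ins, and tolerance is computed as 1/VIF using statsmodels:

import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 73
lat = rng.normal(40, 5, n)
long = 0.9 * lat + rng.normal(0, 2, n)  # deliberately correlated with lat
temp = rng.normal(10, 3, n)

X = np.column_stack([lat, long, temp])
# Matrix of correlation coefficients between predictors.
print(np.corrcoef(X, rowvar=False))

# Tolerance = 1 / VIF; values well below about 0.1 signal collinearity.
Xc = np.column_stack([np.ones(n), X])  # VIF needs the intercept column
for i in range(1, Xc.shape[1]):
    vif = variance_inflation_factor(Xc, i)
    print(f"predictor {i}: VIF = {vif:.2f}, tolerance = {1 / vif:.3f}")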

  9. Dealing with collinearity • Omit predictor variables if they are highly correlated with other predictor variables that remain in the model

  10. ln(C3) = β0 + β1(lat) + β2(long) + β3(lat × long), after centering both lat and long
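
A sketch of that fit with statsmodels; df is assumed to be a data frame holding the site data with columns lnC3, lat, and long:

import pandas as pd
import statsmodels.formula.api as smf

def fit_centered_interaction(df: pd.DataFrame):
    # Center the predictors so the interaction term is less collinear
    # with the main effects.
    df = df.assign(clat=df["lat"] - df["lat"].mean(),
                   clong=df["long"] - df["long"].mean())
    # In a formula, clat * clong expands to clat + clong + clat:clong.
    return smf.ols("lnC3 ~ clat * clong", data=df).fit()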

  11. R² = 0.514

  12. Analysis of variance

  13. Matrix algebra approach to OLS estimation of multiple regression models • Y = Xβ + ε • The normal equations: X′Xb = X′Y • b = (X′X)⁻¹X′Y (a numpy sketch follows)
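
The same estimate in numpy with simulated data; solving the normal equations via linalg.solve avoids forming the inverse explicitly:

import numpy as np

rng = np.random.default_rng(2)
n = 73
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 predictors
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# b = (X'X)^-1 X'Y, computed by solving X'X b = X'Y.
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # close to beta_true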

  14. Forward selection starts from the intercept-only model and, at each step, adds the predictor that most improves the fit, stopping when no candidate meets the entry criterion.

  15. Backward selection (elimination) starts from the full model and, at each step, drops the predictor whose removal least worsens the fit, stopping when every remaining predictor meets the retention criterion (a forward-selection sketch follows).
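
A minimal sketch of forward selection by AIC with statsmodels; the response and candidate names are placeholders, and backward selection works analogously, starting from the full model and dropping terms:

import pandas as pd
import statsmodels.formula.api as smf

def forward_select(df: pd.DataFrame, response: str, candidates: list[str]) -> list[str]:
    """Greedily add the predictor that lowers AIC most; stop when none does."""
    candidates = list(candidates)  # avoid mutating the caller's list
    selected: list[str] = []
    best_aic = smf.ols(f"{response} ~ 1", data=df).fit().aic
    while candidates:
        trials = []
        for c in candidates:
            formula = f"{response} ~ {' + '.join(selected + [c])}"
            trials.append((smf.ols(formula, data=df).fit().aic, c))
        aic, best = min(trials)
        if aic >= best_aic:  # no candidate improves the fit
            break
        best_aic = aic
        selected.append(best)
        candidates.remove(best)
    return selected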

  16. Criteria for the “best”-fitting model in multiple regression with p predictors (two common ones are given below).
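
The slide's table is not reproduced in the transcript; two commonly used criteria (stated here from standard practice, not from the slide) are, for n observations and p predictors:

\[
R^2_{\mathrm{adj}} = 1 - \frac{(1 - R^2)(n - 1)}{n - p - 1},
\qquad
\mathrm{AIC} = n \ln\!\left(\frac{SS_{\mathrm{residual}}}{n}\right) + 2(p + 1)
\]

(the AIC form shown is the least-squares version, defined up to an additive constant).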

  17. Hierarchical partitioning and model selection
