
Some Model Selection Criteria for Regression


Presentation Transcript


  1. Some Model Selection Criteria for Regression

  2. SSE, R2, Adjusted R2, and MSE • SSE decreases as a variable is added to a model • R2 increases as a variable is added to a model • Adjusted R2 and MSE take the number of predictors into account (K = number of predictors) • Useful for comparing models with different numbers of variables
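The transcript does not reproduce any formulas for these quantities; the standard definitions, for a model with K predictors fitted to n observations, are

    SSE = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2, \qquad SST = \sum_{i=1}^{n} (y_i - \bar{y})^2, \qquad R^2 = 1 - \frac{SSE}{SST}

    MSE = \frac{SSE}{n - K - 1}, \qquad R^2_{adj} = 1 - \frac{SSE/(n - K - 1)}{SST/(n - 1)}

Unlike R2, the adjusted R2 (and MSE) can get worse when a predictor that contributes little is added, which is what makes them usable for comparing models of different sizes.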

  3. Cp Criterion • Mallows (1973) • Let p = K+1 • Choose a model for which Cp is small and close to p
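The Cp formula itself is not reproduced in the transcript; Mallows' statistic for a candidate model with p = K+1 parameters is usually written as

    C_p = \frac{SSE_p}{MSE_{full}} - (n - 2p)

where SSE_p is the error sum of squares of the candidate model and MSE_full is the mean square error of the full model containing all available predictors. For a model with little bias, E(Cp) is approximately p, hence the rule of looking for Cp small and close to p.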

  4. Akaike Criterion • An information criterion for the normal regression model (Akaike 1973) • p = K+1 • Choose the model that minimizes AICp • Derived from information theory
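The transcript again omits the formula; for the normal-error regression model the criterion is commonly written as

    AIC_p = n \ln\!\left(\frac{SSE_p}{n}\right) + 2p

so each additional parameter costs a fixed penalty of 2, regardless of the sample size.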

  5. Schwarz Criterion • Bayesian information criterion for the normal regression model (Schwarz 1978) • Choose the model that minimizes BICp • Connected to the Bayes factor for comparing models • Also denoted SICp
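The usual form (not shown in the transcript) replaces the fixed penalty of AICp with one that grows with the sample size:

    BIC_p = n \ln\!\left(\frac{SSE_p}{n}\right) + p \ln n

Since ln n exceeds 2 once n is at least 8, BICp penalizes extra predictors more heavily than AICp and tends to favor smaller models.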

  6. Some Model Selection Procedures • All-subsets regression • Best-subsets regression • Forward selection • Backward elimination • Stepwise regression • Out-of-sample prediction • Forecasting • Cross-validation

  7. Prediction error • Validation of an estimated model by out-of-sample prediction (forecasting) • Leave one out • Split sampling • New data • Prediction (forecast) error: error = actual - forecast

  8. Leave one out • Leave one observation, i, out • Estimate the model using the remaining n-1 observations • Predict yi by ŷi(i), the fitted value from that model • Compute the prediction error ei = yi - ŷi(i) • Repeat for all i = 1, 2, . . . , n
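As an illustration only (not part of the slides), here is a minimal numpy sketch of this procedure, assuming an ordinary least squares fit; the function name loo_prediction_errors is my own.

import numpy as np

def loo_prediction_errors(X, y):
    """Leave-one-out prediction errors e_i = y_i - yhat_{i(i)} for OLS.

    X : (n, K) numpy array of predictor values
    y : (n,) response vector
    """
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])      # add an intercept column
    errors = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i               # drop observation i
        beta, *_ = np.linalg.lstsq(X1[keep], y[keep], rcond=None)
        errors[i] = y[i] - X1[i] @ beta        # prediction error for the held-out case
    return errors

The returned errors feed directly into the overall measures on the next two slides.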

  9. Measures of overall prediction error • Prediction error sum of squares (PRESS) • Mean square prediction error • Root mean square prediction error
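With the leave-one-out prediction errors e_i = y_i - \hat{y}_{i(i)}, the standard forms of these measures (the slide formulas are not reproduced in the transcript) are

    PRESS = \sum_{i=1}^{n} e_i^2, \qquad MSPE = \frac{PRESS}{n}, \qquad RMSPE = \sqrt{MSPE}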

  10. Measures of overall prediction error • Mean absolute prediction error • Relative (percent) mean absolute prediction error
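Again filling in the standard definitions for the formulas the transcript omits, and assuming the relative version is measured against the actual values y_i:

    MAE = \frac{1}{n}\sum_{i=1}^{n} |e_i|, \qquad \text{relative MAE} = \frac{100}{n}\sum_{i=1}^{n} \left|\frac{e_i}{y_i}\right| \,\%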

  11. Split sampling • Divide the sample into two subsamples • Model estimation subsample (size n1) • Model validation subsample (size n2) • Cross-sectional data: select cases randomly • Time series data: use periods t = 1, …, T1 for estimation and t = T1+1, …, T for validation
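A minimal sketch of the split itself, illustrative only; the function name split_sample and the seed argument are my own choices.

import numpy as np

def split_sample(X, y, n1, time_series=False, seed=0):
    """Split the data into an estimation subsample (size n1) and a
    validation subsample (the remaining n - n1 cases).

    Cross-sectional data: cases are chosen at random.
    Time series data: the first n1 periods (t = 1, ..., T1) are used for
    estimation and the rest (t = T1+1, ..., T) for validation.

    X : (n, K) numpy array of predictors, y : (n,) response vector
    """
    n = len(y)
    if time_series:
        idx = np.arange(n)                            # keep the time order
    else:
        idx = np.random.default_rng(seed).permutation(n)
    est, val = idx[:n1], idx[n1:]
    return X[est], y[est], X[val], y[val]

The model is then estimated on (X[est], y[est]) and its prediction errors are computed on (X[val], y[val]).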

  12. Measures of overall prediction error (validation subsample) • Mean square prediction error • Root mean square prediction error

  13. Measures of overall prediction error (validation subsample) • Mean absolute prediction error • Relative (percent) mean absolute prediction error
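Over the n2 validation cases, with prediction errors e_t = y_t - \hat{y}_t computed from the model fitted on the estimation subsample, the measures on slides 12 and 13 take the standard forms

    MSPE = \frac{1}{n_2}\sum_{t} e_t^2, \qquad RMSPE = \sqrt{MSPE}, \qquad MAE = \frac{1}{n_2}\sum_{t} |e_t|, \qquad \text{relative MAE} = \frac{100}{n_2}\sum_{t} \left|\frac{e_t}{y_t}\right| \,\%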
