
Lasso/LARS summary



Presentation Transcript


  1. Lasso/LARS summary. Nasimeh Asgarian

  2. Lasso Summary: Least Absolute Shrinkage and Selection Operator • Given a set of input measurements x1, x2, …, xp and an outcome measurement y, the lasso fits a linear model: ŷ = β0 + β1*x1 + β2*x2 + … + βp*xp • by minimizing Σ(y − ŷ)² • subject to Σ|βj| ≤ s
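A minimal sketch of fitting this model, using scikit-learn's Lasso (a library choice of ours, not named in the slides). Note that scikit-learn solves the equivalent penalized form rather than the constrained form above; a smaller alpha corresponds to a larger budget s. The data here are synthetic, for illustration only.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                    # p = 5 input measurements
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0]) # only two truly active predictors
y = X @ beta_true + rng.normal(scale=0.5, size=100)

# Penalized form: (1/2n) * sum((y - yhat)^2) + alpha * sum(|beta_j|)
model = Lasso(alpha=0.1).fit(X, y)
print(model.intercept_, model.coef_)             # several coefficients shrink to exactly 0
```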

  3. Computation of the lasso solution • Start with all βj = 0 • Find the predictor xj most correlated with y and add it to the model • Take residuals r = y – ŷ • Continue: at each stage, add the predictor most correlated with r to the model • Until all predictors are in the model
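A sketch of the greedy loop this slide describes, under the assumption that X's columns are comparable in scale and y is centered. The function name and the choice to refit by least squares on the active set are ours; the actual lasso/LARS path moves coefficients continuously rather than refitting fully at each stage.

```python
import numpy as np

def greedy_forward(X, y):
    """Greedy scheme from the slide: add the predictor most correlated
    with the residual, refit on the active set, repeat until all are in."""
    y = y - y.mean()                        # assume a centered response
    n, p = X.shape
    active, r, beta = [], y.copy(), None
    while len(active) < p:
        corr = np.abs(X.T @ r)              # correlation with current residual
        corr[active] = -np.inf              # skip predictors already in the model
        active.append(int(np.argmax(corr)))
        Xa = X[:, active]
        beta, *_ = np.linalg.lstsq(Xa, y, rcond=None)  # refit on active set
        r = y - Xa @ beta                   # residuals for the next stage
    return active, beta
```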

  4. LARS Summary: Least Angle Regression • Lasso is a restricted version of LARS • By minimizing L(β, λ) = ||y − X*β||² + λ*||β||₁ • LARS: uses least-squares directions in the active set of variables. • Lasso: uses least-squares directions; if a coefficient crosses zero, that variable is removed from the active set.
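The restriction can be seen directly with scikit-learn's lars_path (again our choice of tooling): method="lar" follows pure least-angle directions, while method="lasso" additionally drops a variable whenever its coefficient path crosses zero, so its path can have extra breakpoints.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import lars_path

X, y = make_regression(n_samples=100, n_features=5, noise=5.0, random_state=0)
_, _, coefs_lars = lars_path(X, y, method="lar")     # pure least-angle path
_, _, coefs_lasso = lars_path(X, y, method="lasso")  # drops sign-crossing variables
print(coefs_lars.shape, coefs_lasso.shape)           # lasso path may have more steps
```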

  5. Computation of the LARS solution • Start with all βj = 0 • Find the predictor xj most correlated with y • Increase the coefficient βj in the direction of the sign of its correlation with y • Take residuals r = y – ŷ

  6. Computation of the lasso solution: LARS (Least Angle Regression) • Stop when some other predictor xk has as much correlation with r as xj has. • Increase (βj, βk) in their joint least-squares direction, until some other predictor xm has as much correlation with the residual r. • Continue until all predictors are in the model.
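A quick numerical check of the equal-correlation property the two slides above describe, using scikit-learn's lars_path (our choice; the slides name no implementation): at every breakpoint of the path, all predictors in the active set have the same absolute correlation with the residual.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lars_path

X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=1)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize so correlations are comparable
y = y - y.mean()

alphas, order, coefs = lars_path(X, y, method="lar")
for step in range(1, coefs.shape[1]):
    r = y - X @ coefs[:, step]             # residual at this breakpoint
    c = np.abs(X.T @ r)
    print(step, np.round(c[order[:step]], 4))  # active |correlations| are (near-)equal
```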

  7. Lasso: choice of tuning parameters • At each step of leave-one-out (LOO) CV: • Do 10-fold CV on the training set (twice) • Find the optimal value of λ and the number of iterations based on the 10-fold CV results, i.e. see which λ value and how many steps give the maximum correlation coefficient. • Use this λ and number of iterations to build the model for the test instance.
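A sketch of this nested procedure, with several assumptions of ours the slide does not spell out: the slide's "number of iterations" is mapped to LassoLars's max_iter (the number of LARS steps), λ to its alpha, scoring uses the Pearson correlation coefficient, and the parameter grid and synthetic data are purely illustrative.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoLars
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV, LeaveOneOut, RepeatedKFold

def corr_score(y_true, y_pred):
    return pearsonr(y_true, y_pred)[0]      # correlation coefficient as the CV score

X, y = make_regression(n_samples=60, n_features=10, noise=10.0, random_state=2)
grid = {"alpha": [0.01, 0.1, 1.0],          # candidate lambda values (assumed grid)
        "max_iter": [2, 5, 10]}             # candidate numbers of LARS steps
inner_cv = RepeatedKFold(n_splits=10, n_repeats=2, random_state=0)  # 10-fold CV, twice
preds = np.empty_like(y, dtype=float)

for train, test in LeaveOneOut().split(X):  # outer LOO loop
    search = GridSearchCV(LassoLars(), grid, cv=inner_cv,
                          scoring=make_scorer(corr_score))
    search.fit(X[train], y[train])          # inner CV picks lambda and step count
    preds[test] = search.predict(X[test])   # model built for the test instance

print("outer LOO correlation:", corr_score(y, preds))
```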
