(This article was originally published at Statistical Modeling, Causal Inference, and Social Science, and syndicated at StatsBlogs.)

Someone on the Stan users list asked about lasso regression in Stan, and Ben replied:

In the rstanarm package we have stan_lm(), which is sort of like ridge regression, and stan_glm() with family = gaussian and prior = laplace() or prior = lasso(). The latter estimates the shrinkage as a hyperparameter while the former fixes it to a specified value. Again, there are possible differences in scaling but you should get good predictions. Also, there is prior = hs() or prior = hs_plus() that implement hierarchical shrinkage on the coefficients.
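The options Ben lists can be sketched in a few lines of R. This is a minimal illustration, not from the original post: the data frame `df` and the predictors `x1`, `x2` are hypothetical, and the `R2()` prior location is an arbitrary choice.

```r
library(rstanarm)

# Ridge-like regression: stan_lm() with a prior on the model's R^2.
# location = 0.5 is an illustrative value, not a recommendation.
fit_ridge <- stan_lm(y ~ x1 + x2, data = df,
                     prior = R2(location = 0.5))

# Laplace prior with a fixed scale vs. lasso(), which estimates
# the amount of shrinkage as a hyperparameter.
fit_laplace <- stan_glm(y ~ x1 + x2, data = df, family = gaussian(),
                        prior = laplace())
fit_lasso   <- stan_glm(y ~ x1 + x2, data = df, family = gaussian(),
                        prior = lasso())

# Hierarchical shrinkage (horseshoe and horseshoe+) priors.
fit_hs  <- stan_glm(y ~ x1 + x2, data = df, family = gaussian(),
                    prior = hs())
fit_hsp <- stan_glm(y ~ x1 + x2, data = df, family = gaussian(),
                    prior = hs_plus())
```

As Ben notes, the scaling of the priors differs across these choices, but predictions should be comparable.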

We discussed horseshoe in Stan a while ago, and there’s more to be said on this topic, including the idea of postprocessing the posterior inferences if there’s a desire to pull some coefficients all the way to zero. There’s also the question of informative priors on the scaling parameters: yes, these hyperparameters can be estimated from data alone, but such estimates can be unstable, and some prior information should be helpful. What we really need are a bunch of examples applying these models to real problems.
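One simple version of the postprocessing idea is to threshold a posterior summary after fitting. A sketch, where `fit` is a hypothetical fitted rstanarm model and the threshold is an arbitrary illustrative value, not a recommendation from the post:

```r
# Posterior medians of all parameters in the fitted model.
post_medians <- apply(as.matrix(fit), 2, median)

# Pull coefficients with small posterior medians all the way to zero.
# The threshold 0.1 is purely illustrative; in practice it would depend
# on the scale of the predictors and the goals of the analysis.
threshold <- 0.1
sparse_coefs <- ifelse(abs(post_medians) < threshold, 0, post_medians)
```

This is decision-theoretic postprocessing of a fully Bayesian fit, as opposed to building the sparsity into the prior itself.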

The post Lasso regression etc in Stan appeared first on Statistical Modeling, Causal Inference, and Social Science.