Posts Tagged ‘Statistical computing’

Lasso regression etc in Stan

February 14, 2017

Someone on the users list asked about lasso regression in Stan, and Ben replied: In the rstanarm package we have stan_lm(), which is sort of like ridge regression, and stan_glm() with family = gaussian and prior = laplace() or prior = lasso(). The latter estimates the shrinkage as a hyperparameter while the former fixes it […]
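
As a rough sketch of what a lasso-style model looks like when written directly in Stan (an illustration, not the rstanarm implementation), one can put double-exponential (Laplace) priors on the coefficients and treat the shrinkage parameter lambda as a hyperparameter; the hyperpriors below are placeholder assumptions.

```stan
// Sketch of a Bayesian lasso-style regression in Stan.
// lambda plays the role of the lasso penalty: larger lambda means
// stronger shrinkage. The hyperpriors are illustrative assumptions.
data {
  int<lower=0> N;              // number of observations
  int<lower=0> K;              // number of predictors
  matrix[N, K] x;              // predictor matrix
  vector[N] y;                 // outcome
}
parameters {
  real alpha;                  // intercept
  vector[K] beta;              // regression coefficients
  real<lower=0> sigma;         // residual scale
  real<lower=0> lambda;        // shrinkage hyperparameter
}
model {
  lambda ~ cauchy(0, 1);       // half-Cauchy hyperprior (assumption)
  sigma ~ cauchy(0, 2.5);      // weakly informative scale prior
  beta ~ double_exponential(0, 1 / lambda);
  y ~ normal(alpha + x * beta, sigma);
}
```

In rstanarm terms this corresponds roughly to the prior = lasso() case described in the excerpt, where the amount of shrinkage is itself estimated rather than fixed.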


HMMs in Stan? Absolutely!

February 7, 2017

Yesterday I had a conversation with Andrew that went like this: Andrew: Hey, someone’s giving a talk today on HMMs (that someone was Yang Chen, who was giving a talk based on her JASA paper Analyzing single-molecule protein transportation experiments via hierarchical hidden Markov models). Maybe we should add some specialized discrete modules to […]


You can fit hidden Markov models in Stan (and thus, also in Stata! and Python! and R! and Julia! and Matlab!)

February 7, 2017

You can fit finite mixture models in Stan; see section 12 of the Stan manual. You can fit change point models in Stan; see section 14.2 of the Stan manual. You can fit mark-recapture models in Stan; see section 14.2 of the Stan manual. You can fit hidden Markov models in Stan; see section 9.6 […]
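
As a hedged sketch of the marginalization trick those manual sections describe, here is a minimal HMM with K hidden states written in Stan; the normal emission model, uniform initial state distribution, and priors are illustrative assumptions, not the manual's exact listing.

```stan
// Minimal HMM sketch: the forward algorithm sums out the discrete
// hidden states, so Stan's HMC only has to sample the continuous
// parameters. Normal emissions and a uniform initial state
// distribution are assumptions made for this example.
data {
  int<lower=1> T;              // length of the observed series
  int<lower=1> K;              // number of hidden states
  vector[T] y;                 // observations
}
parameters {
  simplex[K] theta[K];         // theta[k] = transition probabilities out of state k
  ordered[K] mu;               // state means, ordered for identifiability
  real<lower=0> sigma;         // common emission scale
}
model {
  vector[K] lp;                // lp[k] = log p(y[1:t], state at t = k)
  vector[K] lp_new;
  mu ~ normal(0, 5);
  sigma ~ cauchy(0, 2.5);
  for (k in 1:K)
    lp[k] = -log(K) + normal_lpdf(y[1] | mu[k], sigma);
  for (t in 2:T) {
    for (k in 1:K) {
      vector[K] acc;
      for (j in 1:K)
        acc[j] = lp[j] + log(theta[j, k]);   // transition j -> k
      lp_new[k] = log_sum_exp(acc) + normal_lpdf(y[t] | mu[k], sigma);
    }
    lp = lp_new;
  }
  target += log_sum_exp(lp);   // marginalize over the final state
}
```

Because the discrete states are summed out inside the model block, the same program runs unchanged from whichever interface you prefer (R, Python, Stata, Julia, Matlab), which is the point of the title.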


Thanks for attending StanCon 2017!

January 30, 2017

Thank you all for coming and making the first Stan Conference a success! The organizers were blown away by how many people came: we had over 150 registrants this year! The organizers also managed to get a video stream on YouTube: https://youtu.be/DJ0c7Bm5Djk. We have over 1900 views since StanCon! (We lost […]


Come and work with us!

January 18, 2017

Stan is an open-source, state-of-the-art probabilistic programming language with a high-performance Bayesian inference engine written in C++. Stan has been successfully applied to modeling problems with hundreds of thousands of parameters in fields as diverse as econometrics, sports analytics, physics, pharmacometrics, recommender systems, political science, and many more. Research using Stan has been featured in […]


Stan is hiring! hiring! hiring! hiring!

January 17, 2017

[insert picture of adorable cat entwined with Stan logo] We’re hiring postdocs to do Bayesian inference. We’re hiring programmers for Stan. We’re hiring a project manager. How many people we hire depends on what gets funded. But we’re hiring a few people for sure. We want the best people who love to collaborate, who […]


Stan JSS paper out: “Stan: A probabilistic programming language”

January 13, 2017

As a surprise welcome to 2017, our paper on how the Stan language works, along with an overview of how the MCMC and optimization algorithms work, hit the stands this week. Bob Carpenter, Andrew Gelman, Matthew D. Hoffman, Daniel Lee, Ben Goodrich, Michael Betancourt, Marcus Brubaker, Jiqiang Guo, Peter Li, and Allen Riddell. 2017. Stan: […]


“A Conceptual Introduction to Hamiltonian Monte Carlo”

January 12, 2017

Michael Betancourt writes: Hamiltonian Monte Carlo has proven a remarkable empirical success, but only recently have we begun to develop a rigorous understanding of why it performs so well on difficult problems and how it is best applied in practice. Unfortunately, that understanding is confined within the mathematics of differential geometry which has limited […]
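
For context, here is the one-line version of the idea in a standard formulation (not quoted from the paper): HMC augments the parameters q with auxiliary momenta p and explores the posterior by simulating the Hamiltonian dynamics

$$
H(q, p) = -\log \pi(q) + \tfrac{1}{2}\, p^{\top} M^{-1} p,
\qquad
\frac{dq}{dt} = \frac{\partial H}{\partial p},
\qquad
\frac{dp}{dt} = -\frac{\partial H}{\partial q},
$$

where M is the mass matrix. The review develops, in mostly non-technical terms, why following this flow explores difficult, high-dimensional posteriors so effectively.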


Michael found the bug in Stan’s new sampler

December 22, 2016

Gotcha! Michael found the bug! That was a lot of effort, during which time he produced ten pages of dense LaTeX to help Daniel and me understand the algorithm enough to help debug (we’re trying to write a bunch of these algorithmic details up for a more general audience, so stay tuned). So what was […]


“The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling”

December 10, 2016

Here’s Michael Betancourt writing in 2015: Leveraging the coherent exploration of Hamiltonian flow, Hamiltonian Monte Carlo produces computationally efficient Monte Carlo estimators, even with respect to complex and high-dimensional target distributions. When confronted with data-intensive applications, however, the algorithm may be too expensive to implement, leaving us to consider the utility of approximations such as […]


