Posts Tagged ‘Statistical computing’

Design top down, Code bottom up

May 22, 2017

Top-down design means designing from the client application programmer interface (API) down to the code. The API lays out a precise functional specification, which says what the code will do, not how it will do it. Coding bottom up means coding the lowest-level foundations first, testing them, then continuing to build. Sometimes this requires dropping […]

Read more »

A continuous hinge function for statistical modeling

May 19, 2017

This comes up sometimes in my applied work: I want a continuous “hinge function,” something like the red curve above, connecting two straight lines in a smooth way. Why not include the sharp corner (in this case, the function y = -0.5*x if x < 0 and y = 0 if x > 0)? Two reasons. First, computation: Hamiltonian Monte Carlo can trip on discontinuities. Second, I […]
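One standard way to get such a curve (a sketch only; the post's own construction may differ) is a scaled softplus, which behaves like y = -0.5*x far below zero and like y = 0 far above it, with a scale parameter controlling how rounded the corner is:

```r
# Smooth stand-in for the hinge y = -0.5*x (x < 0), y = 0 (x > 0).
# A scaled softplus; as delta -> 0 it approaches the sharp corner.
smooth_hinge <- function(x, delta = 1) {
  0.5 * delta * log1p(exp(-x / delta))
}

x <- seq(-10, 10, length.out = 200)
plot(x, pmax(-0.5 * x, 0), type = "l", lty = 2, ylab = "y")  # sharp hinge
lines(x, smooth_hinge(x, delta = 2), col = "red")            # smoothed version
```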

Read more »

Using Stan for week-by-week updating of estimated soccer team abilities

May 17, 2017

Milad Kharratzadeh shares this analysis of the English Premier League during last year’s famous season. He fit a Bayesian model using Stan, and the R markdown file is here. The analysis has three interesting features: 1. Team ability is allowed to continuously vary throughout the season; thus, once the season is over, you can see […]
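To make the “continuously varying ability” idea concrete, here is a toy R simulation (illustrative only, not Kharratzadeh's Stan model): each team's ability follows a random walk across match weeks, and a game's score difference is the ability gap plus noise.

```r
# Toy simulation of time-varying team abilities (not the model from the post).
set.seed(1)
n_teams <- 20
n_weeks <- 38
ability <- matrix(0, n_weeks, n_teams)
ability[1, ] <- rnorm(n_teams, 0, 1)
for (w in 2:n_weeks) {
  ability[w, ] <- ability[w - 1, ] + rnorm(n_teams, 0, 0.1)  # weekly drift
}
# Simulated score difference when team i plays team j in week w:
score_diff <- function(w, i, j) rnorm(1, ability[w, i] - ability[w, j], 1)
score_diff(10, 1, 2)
```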

Read more »

Should computer programming be a prerequisite for learning statistics?

May 14, 2017

[cat picture] This came up in a recent discussion thread, I can’t remember exactly where. A commenter pointed out, correctly, that you shouldn’t require computer programming as a prerequisite for a statistics course: there’s lots in statistics that can be learned without knowing how to program. Sure, if you can program you can do a […]

Read more »

Splines in Stan! (including priors that enforce smoothness)

May 13, 2017

Milad Kharratzadeh shares a new case study. This could be useful to a lot of people. And here’s the markdown file with every last bit of R and Stan code. Just for example, here’s the last section of the document, which shows how to simulate the data and fit the model graphed above: Location of […]
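As a rough flavor of that workflow (a sketch only, using a least-squares fit rather than the Stan model in the case study), one can simulate noisy data around a smooth curve and fit it with a B-spline basis:

```r
# Simulate noisy data around a smooth function and fit a B-spline basis.
# Illustrative only; the case study fits the spline coefficients in Stan.
library(splines)
set.seed(1)
n <- 100
x <- sort(runif(n, 0, 10))
y <- sin(x) + rnorm(n, 0, 0.3)
B <- bs(x, df = 10, intercept = TRUE)  # B-spline basis matrix
fit <- lm(y ~ B - 1)                   # least-squares spline fit
plot(x, y)
lines(x, fitted(fit), col = "red")
```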

Read more »

I hate R, volume 38942

April 26, 2017

R doesn’t allow block comments. You have to comment out each line, or you can encapsulate the block in if(0){} which is the world’s biggest hack. Grrrrr. P.S. Just to clarify: I want block commenting not because I want to add long explanatory blocks of text to annotate my scripts. I want block commenting […]
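For concreteness, the workaround described above looks like this (note the disabled block still has to be syntactically valid R, since it is parsed even though it is never run):

```r
# Poor man's block comment: wrap the code in if (0) { ... }.
# The block is parsed but never executed.
if (0) {
  x <- rnorm(1000)
  hist(x)
  summary(lm(x ~ 1))
}
```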

Read more »

Fitting hierarchical GLMs in package X is like driving car Y

April 17, 2017

Given that Andrew started the Gremlin theme, I thought it would only be fitting to link to the following amusing blog post: Chris Brown: Choosing R packages for mixed effects modelling based on the car you drive (on the seascape models blog) It’s exactly what it says on the tin. I won’t spoil the punchline, […]

Read more »

Bayesian Posteriors are Calibrated by Definition

April 12, 2017

Time to get positive. I was asking Andrew whether it’s true that I have the right coverage in Bayesian posterior intervals if I generate the parameters from the prior and the data from the parameters. He replied that yes indeed that is true, and directed me to: Cook, S.R., Gelman, A. and Rubin, D.B. 2006. […]
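That property can be checked directly in a small simulation. Below is a minimal R sketch using a conjugate normal model with known data standard deviation (an assumed toy setup, not the Cook, Gelman, and Rubin validation procedure itself): draw the parameter from the prior, draw data given the parameter, and count how often the central 90% posterior interval contains the parameter.

```r
# Coverage check under the assumed conjugate normal toy model.
set.seed(1)
n_sims <- 2000; n_obs <- 10; sigma <- 1        # known data sd
prior_mean <- 0; prior_sd <- 2
covered <- logical(n_sims)
for (s in 1:n_sims) {
  theta <- rnorm(1, prior_mean, prior_sd)      # parameter from the prior
  y <- rnorm(n_obs, theta, sigma)              # data from the parameter
  post_var  <- 1 / (1 / prior_sd^2 + n_obs / sigma^2)
  post_mean <- post_var * (prior_mean / prior_sd^2 + sum(y) / sigma^2)
  ci <- qnorm(c(0.05, 0.95), post_mean, sqrt(post_var))  # central 90% interval
  covered[s] <- theta > ci[1] && theta < ci[2]
}
mean(covered)  # close to 0.90 by construction
```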

Read more »

Stacking, pseudo-BMA, and AIC type weights for combining Bayesian predictive distributions

April 11, 2017

This post is by Aki. We have often been asked in the Stan user forum how to do model combination for Stan models. Bayesian model averaging (BMA) by computing marginal likelihoods is challenging in theory and even more challenging in practice using only the MCMC samples obtained from the full model posteriors. Some users have […]
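In practice one would compute these weights with the loo package, but the basic “AIC type” pseudo-BMA idea is short enough to sketch directly (illustrative only, without the Bayesian bootstrap refinement the paper discusses): given pointwise log predictive densities for each model, weight each model by its exponentiated total elpd, normalized across models.

```r
# Pseudo-BMA weights from a matrix of pointwise log predictive densities
# (rows = data points, columns = models). Simplified sketch.
pseudo_bma_weights <- function(lpd) {
  elpd <- colSums(lpd)          # estimated elpd per model
  w <- exp(elpd - max(elpd))    # subtract the max for numerical stability
  w / sum(w)
}

# Toy example with three hypothetical models and 50 data points:
set.seed(1)
lpd <- cbind(m1 = rnorm(50, -1.0, 0.3),
             m2 = rnorm(50, -1.1, 0.3),
             m3 = rnorm(50, -1.5, 0.3))
round(pseudo_bma_weights(lpd), 3)
```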

Read more »

“Scalable Bayesian Inference with Hamiltonian Monte Carlo” (Michael Betancourt’s talk this Thurs at Columbia)

April 4, 2017

Scalable Bayesian Inference with Hamiltonian Monte Carlo

Despite the promise of big data, inferences are often limited not by sample size but rather by systematic effects. Only by carefully modeling these effects can we take full advantage of the data—big data must be complemented with big models and the algorithms that can fit them. One […]

Read more »

