Posts Tagged ‘Statistical computing’

“The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling”

December 10, 2016
By

Here’s Michael Betancourt writing in 2015: Leveraging the coherent exploration of Hamiltonian flow, Hamiltonian Monte Carlo produces computationally efficient Monte Carlo estimators, even with respect to complex and high-dimensional target distributions. When confronted with data-intensive applications, however, the algorithm may be too expensive to implement, leaving us to consider the utility of approximations such as […] The post “The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling”…
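The core tension the paper describes can be illustrated with a toy simulation (my own sketch, not from the paper, using a normal model with known variance): the gradient of the potential energy drives Hamiltonian trajectories, and a naively subsampled gradient is an unbiased but noisy stand-in for the full-data gradient, so the simulated trajectory drifts off the true Hamiltonian flow.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy posterior: normal likelihood with unit variance, flat prior.
# The potential energy is U(theta) = -log p(y | theta); its gradient
# is what Hamiltonian Monte Carlo integrates.
y = rng.normal(2.0, 1.0, size=1000)

def grad_U(theta, data):
    # gradient of -sum_i log N(y_i | theta, 1) with respect to theta
    return -(data - theta).sum()

theta = 0.0
full = grad_U(theta, y)

# Naive subsampling: compute the gradient on a minibatch of 100 points
# and rescale by 1000/100 = 10 to mimic the full-data gradient.
sub = [grad_U(theta, rng.choice(y, 100, replace=False)) * 10
       for _ in range(500)]

print(full)         # exact full-data gradient
print(np.mean(sub)) # subsampled gradients are centered on it on average...
print(np.std(sub))  # ...but each one carries substantial noise
```

The spread in the last line is the point: each leapfrog step sees a different noisy gradient, so the errors do not cancel the way the deterministic Hamiltonian flow requires.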

Read more »

Some U.S. demographic data at zipcode level conveniently in R

December 2, 2016
By

Ari Lamstein writes: I chuckled when I read your recent “R Sucks” post. Some of the comments were a bit … heated … so I thought to send you an email instead. I agree with your point that some of the datasets in R are not particularly relevant. The way that I’ve addressed that is […] The post Some U.S. demographic data at zipcode level conveniently in R appeared first…

Read more »

Deep learning, model checking, AI, the no-homunculus principle, and the unitary nature of consciousness

November 21, 2016
By

Bayesian data analysis, as my colleagues and I have formulated it, has a human in the loop. Here’s how we put it on the very first page of our book: The process of Bayesian data analysis can be idealized by dividing it into the following three steps: 1. Setting up a full probability model—a joint […] The post Deep learning, model checking, AI, the no-homunculus principle, and the unitary nature…

Read more »

Only on the internet . . .

November 17, 2016
By

I had this bizarrely escalating email exchange. It started with this completely reasonable message: Professor, I was unable to run your code here: https://www.r-bloggers.com/downloading-option-chain-data-from-google-finance-in-r-an-update/ Besides a small typo [you have a 1 after names (options)], the code fails when you actually run the function. The error I get is a lexical error: Error: lexical error: […] The post Only on the internet . . . appeared first on Statistical Modeling,…

Read more »

Kaggle Kernels

November 15, 2016
By

Anthony Goldbloom writes: In late August, Kaggle launched an open data platform where data scientists can share data sets. In the first few months, our members have shared over 300 data sets on topics ranging from election polls to EEG brainwave data. It’s only a few months old, but it’s already a rich repository for […] The post Kaggle Kernels appeared first on Statistical Modeling, Causal Inference, and Social Science.

Read more »

Stan Webinar, Stan Classes, and StanCon

November 14, 2016
By

This post is by Eric. We have a number of Stan related events in the pipeline. On 22 Nov, Ben Goodrich and I will be holding a free webinar called Introduction to Bayesian Computation Using the rstanarm R Package. Here is the abstract: The goal of the rstanarm package is to make it easier to use Bayesian […] The post Stan Webinar, Stan Classes, and StanCon appeared first on Statistical Modeling, Causal…

Read more »

Stan Case Studies: A good way to jump in to the language

November 14, 2016
By

Wanna learn Stan? Everybody’s talking bout it. Here’s a way to jump in: Stan Case Studies. Find one you like and try it out. P.S. I blogged this last month but it’s so great I’m blogging it again. For this post, the target ...

Read more »

Recently in the sister blog and elsewhere

November 8, 2016
By

Why it can be rational to vote (see also this by Robert Wiblin, “Why the hour you spend voting is the most socially impactful of all”) Be skeptical when polls show the presidential race swinging wildly The polls of the future will be reproducible and open source Testing the role of convergence in language acquisition, […] The post Recently in the sister blog and elsewhere appeared first on Statistical Modeling,…

Read more »

Why I prefer 50% rather than 95% intervals

November 5, 2016
By

I prefer 50% to 95% intervals for 3 reasons: 1. Computational stability, 2. More intuitive evaluation (half the 50% intervals should contain the true value), 3. A sense that in applications it’s best to get a sense of where the parameters and predicted values will be, not to attempt an unrealistic near-certainty. This came up […] The post Why I prefer 50% rather than 95% intervals appeared first on Statistical…
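Point 2 is easy to check by simulation. Here is a minimal sketch (the normal toy model and variable names are mine, not from the post): draw a true parameter, observe noisy data, form a central 50% interval around the sample mean, and count how often the interval covers the truth.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sims, n_obs = 10_000, 20
covered = 0
for _ in range(n_sims):
    theta = rng.normal(0, 1)              # true parameter
    y = rng.normal(theta, 1, size=n_obs)  # observed data
    se = 1 / np.sqrt(n_obs)               # known-variance standard error
    # central 50% interval: +/- 0.674 standard errors (the normal
    # 25th/75th percentile) around the sample mean
    lo, hi = y.mean() - 0.674 * se, y.mean() + 0.674 * se
    covered += (lo <= theta <= hi)

print(covered / n_sims)  # close to 0.50 when the intervals are calibrated
```

Coverage lands near one half, which makes miscalibration easy to spot by eye: if far more or far fewer than half of your 50% intervals contain the truth, something is wrong with the model.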

Read more »

Michael Betancourt has made NUTS even more awesome and efficient!

November 3, 2016
By

In a beautiful new paper, Betancourt writes: The geometric foundations of Hamiltonian Monte Carlo implicitly identify the optimal choice of [tuning] parameters, especially the integration time. I then consider the practical consequences of these principles in both existing algorithms and a new implementation called Exhaustive Hamiltonian Monte Carlo [XMC] before demonstrating the utility of these […] The post Michael Betancourt has made NUTS even more awesome and efficient! appeared first…
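The integration time that the paper tunes is the length of a simulated Hamiltonian trajectory. A minimal sketch of that machinery (my own toy example on a standard-normal target, not code from the paper): the leapfrog integrator takes alternating half steps in momentum and full steps in position, and nearly conserves the Hamiltonian, which is what makes long trajectories usable.

```python
import numpy as np

# Leapfrog integration for a standard-normal target:
# potential U(q) = q^2/2 (so grad U = q), kinetic K(p) = p^2/2.
def leapfrog(q, p, eps, n_steps):
    for _ in range(n_steps):
        p -= eps * q / 2  # half step in momentum
        q += eps * p      # full step in position
        p -= eps * q / 2  # half step in momentum
    return q, p

q0, p0 = 1.0, 0.5
H0 = q0**2 / 2 + p0**2 / 2            # initial total energy
q1, p1 = leapfrog(q0, p0, eps=0.1, n_steps=20)
H1 = q1**2 / 2 + p1**2 / 2            # energy after integrating

print(abs(H1 - H0))  # small: leapfrog nearly conserves the Hamiltonian
```

Choosing `n_steps` (and hence the integration time `eps * n_steps`) well is exactly the tuning problem NUTS and the paper's exhaustive variant address: too short wastes the coherent exploration, too long doubles back on itself.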

Read more »

