Scheduled release strategy
Stan’s moved to a scheduled release strategy where we’ll simply release whatever we have every three months. The Stan 2.20 release just went out last week, so you can expect Stan 2.21 in three months. Our core releases include the math library, the language compiler, and CmdStan. That requires us to keep […]
Author: Bob Carpenter
AnnoNLP conference on data coding for natural language processing
This workshop should be really interesting: Aggregating and analysing crowdsourced annotations for NLP (an EMNLP workshop). November 3–4, 2019. Hong Kong. Silviu Paun and Dirk Hovy are co-organizing it. They’re very organized and know this area as well as anyone. I’m on the program committee, but won’t be able to attend. I really like the problem […]
Peter Ellis on Forecasting Antipodal Elections with Stan
I liked this intro to Peter Ellis from Rob J. Hyndman’s talk announcement: He [Peter Ellis] started forecasting elections in New Zealand as a way to learn how to use Stan, and the hobby has stuck with him since he moved back to Australia in late 2018. You may remember Peter from my previous post […]
Stan examples in Harezlak, Ruppert and Wand (2018) Semiparametric Regression with R
I saw earlier drafts of this when it was in preparation and they were great. Jarek Harezlak, David Ruppert and Matt P. Wand. 2018. Semiparametric Regression with R. UseR! Series. Springer. I particularly like the careful evaluation of variational approaches. I also very much like that it’s packed with visualizations and largely based on worked […]
StanCon 2019: 20–23 August, Cambridge, UK
It’s official. This year’s StanCon is in Cambridge. For details, see StanCon 2019 Home Page What can you expect? There will be two days of tutorials at all levels and two days of invited and submitted talks. The previous three StanCons (NYC 2017, Asilomar 2018, Helsinki 2018) were wonderful experiences for both their content and […]
Ben Lambert. 2018. A Student’s Guide to Bayesian Statistics.
Ben Goodrich, in a Stan forums survey of Stan video lectures, points us to the following book, which introduces Bayes, HMC, and Stan: Ben Lambert. 2018. A Student’s Guide to Bayesian Statistics. SAGE Publications. If Ben Goodrich is recommending it, it’s bound to be good. Amazon reviewers seem to really like it, too. You may […]
Markov chain Monte Carlo doesn’t “explore the posterior”
First some background, then the bad news, and finally the good news. Spoiler alert: The bad news is that exploring the posterior is intractable; the good news is that we don’t need to. Sampling to characterize the posterior There’s a misconception among Markov chain Monte Carlo (MCMC) practitioners that the purpose of sampling is to […]
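The distinction the post draws, that MCMC draws are for estimating posterior expectations rather than for exhaustively exploring the posterior, can be sketched in a few lines. This is a hypothetical illustration (not the post’s code) that stands in simulated draws from a known Normal(1, 2) “posterior” for real sampler output:

```python
import random
import statistics

random.seed(42)

# Stand-in for MCMC output: 10,000 draws from a Normal(1, 2) "posterior".
# A real sampler's draws would be used the same way.
draws = [random.gauss(1.0, 2.0) for _ in range(10_000)]

# Monte Carlo estimates of posterior expectations:
post_mean = statistics.fmean(draws)                   # estimates E[theta], true value 1.0
tail_prob = sum(d > 3.0 for d in draws) / len(draws)  # estimates Pr[theta > 3], true value ~0.16
```

The point is that a few thousand draws pin down expectations like these to a couple of decimal places, even though the draws come nowhere near “covering” the posterior.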
Book reading at Ann Arbor Meetup on Monday night: Probability and Statistics: a simulation-based introduction
The Talk I’m going to be previewing the book I’m in the process of writing at the Ann Arbor R meetup on Monday. Here are the details, including the working title: Probability and Statistics: a simulation-based introduction Bob Carpenter Monday, February 18, 2019 Ann Arbor SPARK, 330 East Liberty St, Ann Arbor I’ve been to […]
Google on Responsible AI Practices
Great and beautifully written advice for any data science setting:
Google. Responsible AI Practices.
Enjoy.
NYC Meetup Thursday: Under the hood: Stan’s library, language, and algorithms
I (Bob, not Andrew!) will be doing a meetup talk next Thursday in New York City. Here’s the link with registration and location and time details (summary: pizza unboxing at 6:30 pm in SoHo): Bayesian Data Analysis Meetup: Under the hood: Stan’s library, language, and algorithms After summarizing what Stan does, this talk will focus […]
The post NYC Meetup Thursday: Under the hood: Stan’s library, language, and algorithms appeared first on Statistical Modeling, Causal Inference, and Social Science.
Melanie Mitchell says, “As someone who has worked in A.I. for decades, I’ve witnessed the failure of similar predictions of imminent human-level A.I., and I’m certain these latest forecasts will fall short as well.”
Melanie Mitchell‘s piece, Artificial Intelligence Hits the Barrier of Meaning (NY Times, behind a limited paywall), is spot-on regarding the hype surrounding the current A.I. boom. It’s soon to come out in book length from FSG, so I suspect I’ll hear about it again in the New Yorker. Like Professor Mitchell, I started my Ph.D. at […]
A.I. parity with the West in 2020
Someone just sent me a link to an editorial by Ken Church, in the journal Natural Language Engineering (who knew that journal was still going? I’d have thought open access would’ve killed it). The abstract of Church’s column says of China, There is a bold government plan for AI with specific milestones for parity with […]
StanCon Helsinki streaming live now (and tomorrow)
We’re streaming live right now!
Thursday 08:45-17:30: YouTube Link
Friday 09:00-17:00: YouTube Link
Timezone is Eastern European Summer Time (EEST) +0300 UTC
Here’s a link to the full program.
There have already been some great talks an…
Three informal case studies: (1) Monte Carlo EM, (2) a new approach to C++ matrix autodiff with closures, (3) C++ serialization via parameter packs
Andrew suggested I cross-post these from the Stan forums to his blog, so here goes. Maximum marginal likelihood and posterior approximations with Monte Carlo expectation maximization: I unpack the goal of max marginal likelihood and approximate Bayes with MMAP and Laplace approximations. I then go through the basic EM algorithm (with a traditional analytic example […]
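The E-step/M-step loop the first case study unpacks can be illustrated with the traditional analytic example, a two-component normal mixture. This is a hypothetical sketch on simulated data (assuming known unit variances and 50/50 mixing for simplicity), not code from the case studies:

```python
import math
import random

random.seed(0)

# Simulated data: equal mixture of Normal(-2, 1) and Normal(3, 1).
data = [random.gauss(-2.0, 1.0) if random.random() < 0.5 else random.gauss(3.0, 1.0)
        for _ in range(2000)]

def em_two_means(data, mu_init=(-1.0, 1.0), iters=50):
    """EM for a 50/50 mixture of two unit-variance normals; estimates the two means."""
    mu1, mu2 = mu_init
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        # (normal densities with the constants cancelling in the ratio).
        resp = []
        for x in data:
            p1 = math.exp(-0.5 * (x - mu1) ** 2)
            p2 = math.exp(-0.5 * (x - mu2) ** 2)
            resp.append(p1 / (p1 + p2))
        # M-step: responsibility-weighted means maximize the expected log likelihood.
        w1 = sum(resp)
        mu1 = sum(r * x for r, x in zip(resp, data)) / w1
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - w1)
    return mu1, mu2

mu1, mu2 = em_two_means(data)  # converges near the true means (-2, 3)
```

Here the E-step is analytic; the Monte Carlo variant in the case study replaces an intractable E-step expectation with an average over simulated draws.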
Thanks, NVIDIA
Andrew and I both received a note like this from NVIDIA: We have reviewed your NVIDIA GPU Grant Request and are happy to support your work with the donation of (1) Titan Xp to support your research. Thanks! In case other people are interested, NVIDIA’s GPU grant program provides ways for faculty or research scientists to […]
Advice on soft skills for academics
Julia Hirschberg sent this along to the natural language processing mailing list at Columbia: here are some slides from last spring’s CRA-W Grad Cohort and previous years that might be of interest. all sorts of topics such as interviewing, building confidence, finding a thesis topic, preparing your thesis proposal, publishing, entrepreneurialism, and a very interesting […]
Where do I learn about log_sum_exp, log1p, lccdf, and other numerical analysis tricks?
Richard McElreath inquires: I was helping a colleague recently fix his MATLAB code by using log_sum_exp and log1m tricks. The natural question he had was, “where do you learn this stuff?” I checked Numerical Recipes, but the statistical parts are actually pretty thin (at least in my 1994 edition). Do you know of any books/papers […]
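For readers wondering what these tricks actually look like, here is a minimal Python sketch of the two McElreath mentions. The log_sum_exp identity log Σᵢ exp(xᵢ) = m + log Σᵢ exp(xᵢ − m), with m = max xᵢ, shifts the largest term to exponentiate to 1 so nothing overflows; log1m(x) = log(1 − x) is written via the standard library’s log1p, which is accurate for arguments near zero:

```python
import math

def log_sum_exp(xs):
    """Stable log(sum(exp(x) for x in xs)): subtract the max before
    exponentiating so the largest term becomes exp(0) = 1."""
    m = max(xs)
    if m == float('-inf'):          # all terms are zero on the linear scale
        return float('-inf')
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log1m(x):
    """log(1 - x), accurate for x near 0, via the stdlib's log1p."""
    return math.log1p(-x)

# Naively, log(exp(-1000) + exp(-1001)) underflows to log(0) = -inf;
# the shifted version recovers the correct answer, about -999.69.
stable = log_sum_exp([-1000.0, -1001.0])
```

The same shift-by-the-max move underlies stable softmax and log-likelihoods of mixtures; lccdf-style functions apply the analogous care to distribution tails.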