Author: Bob Carpenter

Melanie Mitchell says, “As someone who has worked in A.I. for decades, I’ve witnessed the failure of similar predictions of imminent human-level A.I., and I’m certain these latest forecasts will fall short as well.”

Melanie Mitchell’s piece, Artificial Intelligence Hits the Barrier of Meaning (NY Times, behind a limited paywall), is spot-on regarding the hype surrounding the current A.I. boom. It’s soon to come out in book length from FSG, so I suspect I’ll hear about it again in the New Yorker. Like Professor Mitchell, I started my Ph.D. at […]

The post Melanie Mitchell says, “As someone who has worked in A.I. for decades, I’ve witnessed the failure of similar predictions of imminent human-level A.I., and I’m certain these latest forecasts will fall short as well.” appeared first on Statistical Modeling, Causal Inference, and Social Science.

A.I. parity with the West in 2020

Someone just sent me a link to an editorial by Ken Church in the journal Natural Language Engineering (who knew that journal was still going? I’d have thought open access would’ve killed it). The abstract of Church’s column says of China, There is a bold government plan for AI with specific milestones for parity with […]

Three informal case studies: (1) Monte Carlo EM, (2) a new approach to C++ matrix autodiff with closures, (3) C++ serialization via parameter packs

Andrew suggested I cross-post these from the Stan forums to his blog, so here goes. Maximum marginal likelihood and posterior approximations with Monte Carlo expectation maximization: I unpack the goal of max marginal likelihood and approximate Bayes with MMAP and Laplace approximations. I then go through the basic EM algorithm (with a traditional analytic example […]

Advice on “soft skills” for academics

Julia Hirschberg sent this along to the natural language processing mailing list at Columbia: here are some slides from last spring’s CRA-W Grad Cohort and previous years that might be of interest. all sorts of topics such as interviewing, building confidence, finding a thesis topic, preparing your thesis proposal, publishing, entrepreneurialism, and a very interesting […]

Where do I learn about log_sum_exp, log1p, lccdf, and other numerical analysis tricks?

Richard McElreath inquires: I was helping a colleague recently fix his MATLAB code by using log_sum_exp and log1m tricks. The natural question he had was, “where do you learn this stuff?” I checked Numerical Recipes, but the statistical parts are actually pretty thin (at least in my 1994 edition). Do you know of any books/papers […]
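The post’s answer isn’t excerpted here, but the core trick behind log_sum_exp is easy to sketch (a generic illustration, not code from the post):

```python
import math

def log_sum_exp(xs):
    """Numerically stable log(sum(exp(x) for x in xs))."""
    # Subtract the max so every exponent is <= 0: exp cannot overflow,
    # and at least one term is exp(0) = 1, so the log argument is >= 1.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# Naive math.log(math.exp(1000) + math.exp(1000)) overflows,
# but the stable version returns 1000 + log(2) ≈ 1000.6931.
print(log_sum_exp([1000.0, 1000.0]))

# log1p is the same flavor of trick for the other end of the scale:
# math.log(1 + 1e-20) rounds 1 + 1e-20 to 1.0 and returns 0.0,
# while math.log1p(1e-20) keeps the leading term of x - x**2/2 + ...
print(math.log1p(1e-20))
```

Stan’s log1m(x) is just log1p(-x), and the _lccdf functions apply the same ideas to compute log complementary CDFs without underflow.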
