Here are my slides [more or less] for the introductory overview lecture I am giving today at JSM 2019, 4:00-5:50, CC-Four Seasons I. There is obviously quite an overlap with earlier courses I gave on the topic, although I refrained here from mentioning any specific application (like population genetics) to focus on statistical and computational […]
Category: ABC
off to Denver! [JSM2019]
As of today, I am attending JSM 2019 in Denver, giving an “Introductory Overview Lecture” on The ABC of Approximate Bayesian Computation on Sunday afternoon and chairing an ABC session on Monday morning. As far as I know these are the only ABC sessions at JSM this year… And hence the only sessions I will […]
uncertainty in the ABC posterior
In the most recent Bayesian Analysis, Marko Järvenpää et al. (including my coauthor Aki Vehtari) consider an ABC setting where the number of available simulations of pseudo-samples is limited. And where they want to quantify the amount of uncertainty resulting from the estimation of the ABC posterior density. Which is a version of the Monte […]
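To make the issue concrete, here is a crude Python illustration of my own, not the Gaussian-process-based approach of the paper: with a fixed budget of pseudo-data simulations, the ABC posterior estimate is itself noisy, and bootstrapping the accepted draws gives a rough picture of that Monte Carlo uncertainty (the toy Normal-mean model, tolerance, and budget below are all made up for the example).

```python
# Toy illustration of Monte Carlo uncertainty in an ABC posterior estimate
# under a fixed simulation budget (not the GP-based method of the paper).
import numpy as np

rng = np.random.default_rng(0)
y_obs = 1.2                      # observed summary (here, the sample mean)
budget, eps = 2000, 0.1          # fixed simulation budget and tolerance

theta = rng.normal(0, 2, budget)                  # draws from the prior
pseudo = rng.normal(theta, 1 / np.sqrt(20))       # simulated summaries
accepted = theta[np.abs(pseudo - y_obs) < eps]    # ABC rejection step

# Bootstrap the accepted sample to gauge the uncertainty in, say, the
# posterior mean estimate that is due to the limited simulation budget.
boot = [rng.choice(accepted, accepted.size).mean() for _ in range(500)]
print(f"{accepted.size} acceptances, posterior mean ~ {accepted.mean():.3f}"
      f" +/- {np.std(boot):.3f} (bootstrap)")
```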
locusts in a random forest
My friends from Montpellier, where I am visiting today, Arnaud Estoup, Jean-Michel Marin, and Louis Raynal, along with their co-authors, have recently posted on bioRxiv a paper using ABC-RF (Random Forests) to analyse the divergence of two populations of desert locusts in Africa. (I actually first heard of their paper by an unsolicited email from […]
ateliers statistiques bayésiens
The French Statistical Association is running a training workshop on practical computational Bayesian methods on 10-12 September 2019 in Paris (IHP), led by Sylvain Le Corff (Telecom SudParis – Institut Polytechnique de Paris) for the initia…
likelihood-free approximate Gibbs sampling
“Low-dimensional regression-based models are constructed for each of these conditional distributions using synthetic (simulated) parameter value and summary statistic pairs, which then permit approximate Gibbs update steps (…) synthetic datasets are not generated during each sampler iteration, thereby providing efficiencies for expensive simulator models, and only require sufficient synthetic datasets to adequately construct the full […]
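As a rough Python sketch of the idea quoted above (my own toy stand-in, not the authors' code or regression models): pre-simulate a bank of (parameter, summary) pairs once, fit a cheap regression of each component on the remaining components and the summary, and recycle those fits for approximate Gibbs updates at the observed summary, with noise resampled from the residual scale. The linear simulator, dimensions, and variable names below are all hypothetical.

```python
# Sketch: regression-based approximate Gibbs updates built from a
# pre-simulated bank of (theta, summary) pairs, so no simulator calls
# are needed inside the Gibbs sweep (toy stand-in for the quoted idea).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_bank, d = 5000, 2                                 # bank size, dim(theta)
theta_bank = rng.normal(0, 1, (n_bank, d))          # draws from the prior
summ_bank = theta_bank.sum(1, keepdims=True) + rng.normal(0, .3, (n_bank, 1))
s_obs = np.array([1.0])                             # observed summary

# one regression per conditional theta_j | theta_{-j}, s
models, resid_sd = [], []
for j in range(d):
    X = np.hstack([np.delete(theta_bank, j, axis=1), summ_bank])
    m = LinearRegression().fit(X, theta_bank[:, j])
    models.append(m)
    resid_sd.append(np.std(theta_bank[:, j] - m.predict(X)))

# approximate Gibbs sweep reusing the bank (no new simulator calls)
theta = np.zeros(d)
for it in range(1000):
    for j in range(d):
        x = np.hstack([np.delete(theta, j), s_obs]).reshape(1, -1)
        theta[j] = models[j].predict(x)[0] + resid_sd[j] * rng.normal()
```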
talk at CISEA 2019
Here are my slides for the overview talk I am giving at CISEA 2019, in Abidjan, closely resembling earlier talks, except for the second slide!
A precursor of ABC-Gibbs
Following our arXival of ABC-Gibbs, Dennis Prangle pointed out to us a 2016 paper by Athanasios Kousathanas, Christoph Leuenberger, Jonas Helfer, Mathieu Quinodoz, Matthieu Foll, and Daniel Wegmann, Likelihood-Free Inference in High-Dimensional Models, published in Genetics, Vol. 203, 893–904, June 2016. This paper contains a version of ABC Gibbs where parameters are sequentially simulated […]
ABC with Gibbs steps
With Grégoire Clarté, Robin Ryder and Julien Stoehr, all from Paris-Dauphine, we have just arXived a paper on the specifics of ABC-Gibbs, which is a version of ABC where the generic ABC accept-reject step is replaced by a sequence of n conditional ABC accept-reject steps, each aiming at an ABC version of a conditional distribution […]
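For readers unfamiliar with the scheme, here is a toy Python sketch of the principle on a tiny hierarchical Normal example (my own illustration, not the code of the paper, and with an arbitrary tolerance): each Gibbs step is itself a small ABC accept-reject, using a statistic adapted to that conditional.

```python
# Toy sketch of the ABC-Gibbs principle: each component is updated by a
# conditional ABC accept-reject step, with a summary tailored to that
# conditional (illustration only, on a small hierarchical Normal model).
import numpy as np

rng = np.random.default_rng(2)
# toy model: mu ~ N(0, 10), mu_i ~ N(mu, 1), x_i ~ N(mu_i, 1)
x_obs = np.array([0.8, 1.4, 1.1])
n = x_obs.size
mu, mu_i = 0.0, np.zeros(n)
eps = 0.2                                   # arbitrary ABC tolerance

for it in range(2000):
    # ABC step for mu given the mu_i: summary = mean of the mu_i
    while True:
        prop = rng.normal(0, np.sqrt(10))
        fake = rng.normal(prop, 1, n)
        if abs(fake.mean() - mu_i.mean()) < eps:
            mu = prop
            break
    # ABC step for each mu_i given mu: summary = x_i itself
    for i in range(n):
        while True:
            prop = rng.normal(mu, 1)
            fake = rng.normal(prop, 1)
            if abs(fake - x_obs[i]) < eps:
                mu_i[i] = prop
                break
```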
postdoc position still open
The post-doctoral position supported by the ANR funding of our Paris-Saclay-Montpellier research conglomerate on approximate Bayesian inference and computation remains open for the time being. We are more particularly looking for candidates with a strong background in mathematical statistics, esp. Bayesian non-parametrics, towards the analysis of the limiting behaviour of approximate Bayesian inference. Candidates should […]
selecting summary statistics [a tale of two distances]
Jonathan Harrison came to give a seminar in Warwick [which I could not attend], which made me aware of his paper with Ruth Baker on the selection of summaries in ABC. The setting is an ABC-SMC algorithm and it relates to Fearnhead and Prangle (2012), Barnes et al. (2012), our own random forest approach, […]
ABC in Grenoble, 19-20 March 2020
The next occurrence of the “ABC in…” workshops will take place in Grenoble, France, on 19-20 March 2020. Both local organising and international scientific committees have been constituted and the program should soon be constructed, along with calls to contributions launched at the same time. As in most earlier versions of the workshops (ABC in […]
robust Bayesian synthetic likelihood
David Frazier (Monash University) and Chris Drovandi (QUT) have recently come up with a robustness study of Bayesian synthetic likelihood that somehow mirrors our own work with David. In a sense, Bayesian synthetic likelihood is definitely misspecified from the start in assuming a Normal distribution on the summary statistics. When the data generating process is […]
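To make that Normality assumption concrete, here is a minimal Python sketch of a plain (non-robust) synthetic likelihood evaluation, with a toy simulator and parameter values of my own: the distribution of the summary statistics at a given parameter value is replaced by a Gaussian fitted to m simulated summaries, and the resulting log-density at the observed summaries would then be plugged into an MCMC sampler over the parameter.

```python
# Minimal sketch of a plain synthetic likelihood evaluation: fit a Gaussian
# to m simulated summaries at theta and evaluate it at the observed summary.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)

def simulate_summaries(theta, m=100, n=50):
    """Toy simulator: summaries are the sample mean and variance."""
    x = rng.normal(theta, 1, (m, n))
    return np.column_stack([x.mean(1), x.var(1)])

def log_synthetic_likelihood(theta, s_obs, m=100):
    s = simulate_summaries(theta, m)
    mu_hat = s.mean(0)                       # fitted Gaussian mean
    sigma_hat = np.cov(s, rowvar=False)      # fitted Gaussian covariance
    return multivariate_normal(mu_hat, sigma_hat).logpdf(s_obs)

s_obs = np.array([1.0, 1.1])                 # observed summaries
print(log_synthetic_likelihood(0.9, s_obs))
```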
the true meaning of ABC
MCMC importance samplers for intractable likelihoods
Jordan Franks just posted on arXiv his PhD dissertation at the University of Jyväskylä, where he discusses several of his works: M. Vihola, J. Helske, and J. Franks. Importance sampling type estimators based on approximate marginal MCMC. Preprint arXiv:1609.02541v5, 2016. J. Franks and M. Vihola. Importance sampling correction versus standard averages of reversible MCMCs in […]
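In the same spirit, here is a hedged Python sketch of the general importance-sampling-correction idea behind these works (my own toy version, with tractable stand-in densities, not the authors' estimators): run MCMC on a cheap approximate posterior, then reweight the chain by the target-to-approximation ratio to correct posterior expectations.

```python
# Sketch: MCMC on an approximate posterior, followed by an importance
# sampling correction of posterior expectations (toy, tractable densities).
import numpy as np

rng = np.random.default_rng(4)

def log_target(theta):          # "exact" target (tractable here for illustration)
    return -0.5 * (theta - 1.0) ** 2

def log_approx(theta):          # cheap approximation used inside the MCMC
    return -0.5 * (theta - 0.8) ** 2 / 1.2

# random-walk Metropolis targeting the approximation
chain, theta = [], 0.0
for _ in range(20000):
    prop = theta + 0.5 * rng.normal()
    if np.log(rng.uniform()) < log_approx(prop) - log_approx(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain[2000:])                 # drop burn-in

# importance-sampling correction of posterior expectations
logw = log_target(chain) - log_approx(chain)
w = np.exp(logw - logw.max())
print("uncorrected mean:", chain.mean())
print("IS-corrected mean:", np.sum(w * chain) / w.sum())
```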
over-confident about mis-specified models?
Ziheng Yang and Tianqi Zhu published a paper in PNAS last year that criticises Bayesian posterior probabilities used in the comparison of models under misspecification as “overconfident”. The paper is written from a phylogeneticist point of view, rather than from a statistician’s perspective, as shown by the Editor in charge of the paper [although I […]
holistic framework for ABC
An AISTATS 2019 paper was recently arXived by Kelvin Hsu and Fabio Ramos. Proposing an ABC method “…consisting of (1) a consistent surrogate likelihood model that modularizes queries from simulation calls, (2) a Bayesian learning objective for hyperparameters that improves inference accuracy, and (3) a posterior surrogate density and a super-sampling inference algorithm using its […]
adaptive copulas for ABC
A paper on ABC I read on my way back from Cambodia: Yanzhi Chen and Michael Gutmann arXived an ABC [in Edinburgh] paper on learning the target via Gaussian copulas, to be presented at AISTATS this year (in Okinawa!). Linking post-processing (regression) ABC and sequential ABC. The drawback in the regression approach is that the […]
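For the copula part, here is a rough Python sketch of the Gaussian-copula idea (my own illustration on a stand-in ABC sample, not the paper's adaptive method): map each margin of the accepted ABC draws to normal scores through its ranks, estimate the correlation of those scores, and use the resulting Gaussian copula together with the empirical margins to approximate, and resample from, the joint ABC posterior.

```python
# Sketch: fit a Gaussian copula to accepted ABC draws via rank-based normal
# scores, then resample from it through the empirical margins (toy example).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
accepted = rng.multivariate_normal([0, 1], [[1, .7], [.7, 2]], 500)  # stand-in ABC sample

# margins -> uniforms via ranks, then -> normal scores
n, d = accepted.shape
u = (accepted.argsort(0).argsort(0) + 0.5) / n
z = norm.ppf(u)
R = np.corrcoef(z, rowvar=False)            # copula correlation matrix

# sample from the fitted copula and map back through the empirical margins
z_new = rng.multivariate_normal(np.zeros(d), R, 1000)
u_new = norm.cdf(z_new)
samples = np.column_stack([np.quantile(accepted[:, j], u_new[:, j]) for j in range(d)])
```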
asymptotics of synthetic likelihood [a reply from the authors]
[Here is a reply from David, Chris, and Robert on my earlier comments, highlighting some points I had missed or misunderstood.] Dear Christian, Thanks for your interest in our synthetic likelihood paper and the thoughtful comments you wrote about it on your blog. We’d like to respond to the comments to avoid some misconceptions. Your […]