“Introducing a sparsity prior avoids overfitting the number of clusters not only for finite mixtures, but also (somewhat unexpectedly) for Dirichlet process mixtures which are known to overfit the number of clusters.” On my way back from Clermont-Ferrand, in an old train that reminded me of my previous ride on that line that took place […]
Category: University of Warwick
Bayesian webinar: Bayesian conjugate gradient
Bayesian Analysis is launching its webinar series on discussion papers! Meaning the first 90 registrants will be able to participate interactively via the Zoom Conference platform while additional registrants will be able to view the Webinar on a dedicated YouTube Channel. This fantastic initiative is starting with the Bayesian conjugate gradient method of Jon Cockayne […]
unimaginable scale culling
Despite the evidence brought by ABC analyses on the inefficiency of culling the British Isles badger population in massive proportions against bovine tuberculosis, the [sorry excuse for a] United Kingdom government has permitted a massive expansion of badger culling, with up to 64,000 animals likely to be killed this autumn… Since the cows are the primary […]
likelihood-free inference by ratio estimation
“This approach for posterior estimation with generative models mirrors the approach of Gutmann and Hyvärinen (2012) for the estimation of unnormalised models. The main difference is that here we classify between two simulated data sets while Gutmann and Hyvärinen (2012) classified between the observed data and simulated reference data.” A 2018 arXiv posting by Owen […]
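As a hedged sketch of the ratio-estimation idea (a toy setup in the spirit of the quoted approach, not the paper's exact algorithm): a classifier trained to discriminate samples simulated from the model at a given parameter from samples of a reference distribution recovers the log likelihood ratio through its log-odds. Here a logistic regression, fitted by Newton/IRLS iterations, separates draws from N(θ, 1) and N(0, 1), for which the exact log-ratio is θx − θ²/2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simulated data sets: model at theta versus a reference distribution
theta = 2.0
x1 = rng.normal(theta, 1.0, size=5000)   # "model" samples, label 1
x0 = rng.normal(0.0, 1.0, size=5000)     # "reference" samples, label 0

X = np.column_stack([np.ones(10_000), np.concatenate([x1, x0])])
y = np.concatenate([np.ones(5000), np.zeros(5000)])

# Logistic regression via Newton/IRLS: the fitted log-odds estimate the
# log likelihood ratio log p(x | theta) / p_ref(x)
w = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    H = X.T @ (X * (p * (1.0 - p))[:, None])   # Hessian of the log-likelihood
    w += np.linalg.solve(H, X.T @ (y - p))

def log_ratio(x):
    # classifier log-odds = estimated log likelihood ratio at x
    return w[0] + w[1] * x

# For N(2,1) vs N(0,1) the exact log-ratio is 2x - 2, so w should land near (-2, 2)
```

The linear feature suffices here only because both densities are Gaussian with equal variance; a richer classifier would be needed for less convenient models.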
unbiased product of expectations
While I was not involved in any way, or even aware of this research, Anthony Lee, Simone Tiberi, and Giacomo Zanella have an incoming paper in Biometrika, which was partly written while all three authors were at the University of Warwick. The purpose is to design an efficient manner to approximate the product of […]
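On the underlying issue (a hedged toy illustration of why products of expectations need care, not the authors' construction): reusing the same Monte Carlo draws in a product of averages biases the estimate of a product of expectations by a variance term, while multiplying averages computed from independent sub-samples is unbiased.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy target: (E[X])^2 with X ~ N(1, 1), whose true value is 1
n, reps = 100, 20_000
same, split = [], []
for _ in range(reps):
    x = rng.normal(1.0, 1.0, size=n)
    same.append(x.mean() ** 2)                             # reuses draws: E = 1 + 1/n
    split.append(x[: n // 2].mean() * x[n // 2:].mean())   # independent halves: E = 1
```

Averaging over the repetitions, the first estimator concentrates near 1 + 1/n = 1.01 and the second near 1, at the cost of a larger variance for the split version; efficiency in that trade-off is presumably where the paper's contribution lies.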
a generalized representation of Bayesian inference
Jeremias Knoblauch, Jack Jewson and Theodoros Damoulas, all affiliated with Warwick (hence a potentially biased reading!), arXived a paper on loss-based Bayesian inference that Jack discussed with me on my last visit to Warwick. I was somewhat scared by the 61 pages, the first 8 of which are in NeurIPS style. The authors […]
O’Bayes 19/4
Last talks of the conference! With Rui Paulo (along with Gonzalo Garcia-Donato) considering the special case of factors when doing variable selection. Which is an interesting question that I had never considered, as at best I would remove all levels or keep them all. Except that there may be misspecification in the factors, as for […]
O’Bayes 19/3.5
Among the posters at the second poster session yesterday night, one by Judith ter Schure was visually standing out by following the #betterposter design suggested by Mike Morrison a few months ago. A design on which I have ambivalent feelings. On the one hand, reducing the material on a poster is generally a good idea as […]
O’Bayes 19/3
Nancy Reid gave the first talk of the [Canada] day, in an impressive comparison of all approaches in statistics that involve a distribution of sorts on the parameter, connected with the presentation she gave at BFF4 in Harvard two years ago, including safe Bayes options this time. This was related to several (most?) of the […]
O’Bayes 19/2
One talk on Day 2 of O’Bayes 2019 was by Ryan Martin on data dependent priors (or “priors”). Which I have already discussed in this blog. Including the notion of a Gibbs posterior about quantities that “are not always defined through a model” [which is debatable if one sees it as part of a semi-parametric […]
O’Bayes 19/1 [snapshots]
Although yesterday’s O’Bayes 2019 tutorials were poorly attended, despite being great entries into objective Bayesian model choice, recent advances in MCMC methodology, and the multiple layers of BART, for which I have to blame myself, having stuck the beginning of O’Bayes too closely to the end of BNP, as only the […]
O’Bayes 2019 has now started!
The O’Bayes 2019 conference in Warwick University has now started, with about 100 participants meeting over four days (plus one of tutorials) in the Zeeman maths building of the University. Quite a change of location and weather when compared with the previous one in Austin. As an organiser I hope all goes well at the […]
Bayesian conjugate gradients [open for discussion]
When fishing for an illustration for this post on Google, I came upon this Bayesian methods for hackers cover, a book about which I have no clue whatsoever (!) but that mentions probabilistic programming. Which serves as a perfect (?!) introduction to the call for discussion in Bayesian Analysis of the incoming Bayesian conjugate gradient […]
skipping sampler
“The Skipping Sampler is an adaptation of the MH algorithm designed to sample from targets which have areas of zero density. It ‘skips’ across such areas, much as a flat stone can skip or skim repeatedly across the surface of water.” An interesting challenge is simulating from a density restricted to a set C when […]
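To illustrate the skipping idea (a simplified toy version under strong symmetry assumptions, not the paper's algorithm, which derives the exact detailed-balance conditions): when a symmetric random-walk proposal lands in a zero-density region, the move keeps going in the same direction by further random increments until the support is reached again, and the endpoint is then accepted or rejected as in Metropolis–Hastings. The target below is a standard normal with the interval (−1, 1) carved out.

```python
import numpy as np

rng = np.random.default_rng(1)

def target(x):
    # standard normal density restricted to |x| >= 1: zero-density "hole" on (-1, 1)
    return np.exp(-0.5 * x * x) if abs(x) >= 1.0 else 0.0

def skipping_mh(n_iter=5000, x0=1.5, step=0.8, max_skips=100):
    chain = [x0]
    x = x0
    for _ in range(n_iter):
        d = rng.normal(0.0, step)      # symmetric random-walk proposal
        y = x + d
        k = 0
        while target(y) == 0.0 and k < max_skips:
            # "skip" onward across the hole, keeping the proposal's direction
            y += np.sign(d) * abs(rng.normal(0.0, step))
            k += 1
        # plain Metropolis accept/reject; symmetry of the full skipping move is
        # assumed here for illustration only
        if target(y) > 0.0 and rng.random() < target(y) / target(x):
            x = y
        chain.append(x)
    return np.array(chain)
```

Running the chain shows it hopping between the two disconnected pieces of the support, which a vanilla random walk with a small step size would essentially never manage.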
selecting summary statistics [a tale of two distances]
As Jonathan Harrison came to give a seminar in Warwick [which I could not attend], it made me aware of his paper with Ruth Baker on the selection of summaries in ABC. The setting is an ABC-SMC algorithm and it relates to Fearnhead and Prangle (2012), Barnes et al. (2012), our own random forest approach, […]
O’Bayes 2019 conference program
The full and definitive program of the O’Bayes 2019 conference in Warwick is now on line. Including discussants for all papers. And the three [and free] tutorials on Friday afternoon, 28 June, on model selection (M. Barbieri), MCMC recent advances (G.O. Roberts) and BART (E.I. George). Registration remains open at the reduced rate and submissions […]
PhD studentships at Warwick
There is an exciting opening for several PhD positions at Warwick, in the departments of Statistics and of Mathematics, as part of the Centre for Doctoral Training in Mathematics and Statistics newly created by the University. CDT studentships are funded for four years and funding is open to students from the European Union without restrictions. […]
Stein’s method in machine learning [workshop]
There will be an ICML workshop on Stein’s method in machine learning & statistics, next July 14 or 15, located in Long Beach, CA. Organised by François-Xavier Briol (formerly Warwick), Lester Mackey, Chris Oates (formerly Warwick), Qiang Liu, and Larry Goldstein. To quote from the webpage of the workshop, Stein’s method is a technique from […]