Posts Tagged ‘books’

Le Monde puzzle [#1036]

January 3, 2018

An arithmetic Le Monde mathematical puzzle to conclude 2017: Find (a₁,…,a₁₃), a permutation of (1,…,13), such that a₁/(a₂+a₃) = (a₂+a₃)/(a₃+a₄+a₅) = b₁ < 1, a₆/(a₆+a₇) = (a₆+a₇)/(a₇+a₈+a₉) = (a₇+a₈+a₉)/(a₅+a₉+a₁₀) = b₂ < 1, and (a₁₁+a₁₂)/(a₁₂+a₁₃) = (a₁₂+a₁₃)/(a₁₃+a₁₀) = b₃ < 1. The question can be solved by brute force simulation, checking all possible permutations of (1,…,13). But 13! is about 6.2 billion, a wee bit too many cases. Despite the problem being made of only four constraints […]
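To make the brute-force idea concrete, here is a minimal random-search sketch (my own illustration, not the post's code): rather than enumerating all 13! permutations, it draws permutations at random and keeps those satisfying the three ratio chains.

# random search over permutations of 1:13 (illustrative sketch only)
ok <- function(a) {
  b1 <- a[1] / (a[2] + a[3])
  b2 <- a[6] / (a[6] + a[7])
  b3 <- (a[11] + a[12]) / (a[12] + a[13])
  b1 < 1 && b2 < 1 && b3 < 1 &&
    isTRUE(all.equal(b1, (a[2] + a[3]) / (a[3] + a[4] + a[5]))) &&
    isTRUE(all.equal(b2, (a[6] + a[7]) / (a[7] + a[8] + a[9]))) &&
    isTRUE(all.equal(b2, (a[7] + a[8] + a[9]) / (a[5] + a[9] + a[10]))) &&
    isTRUE(all.equal(b3, (a[12] + a[13]) / (a[13] + a[10])))
}
sols <- NULL
for (t in 1:1e6) {
  a <- sample(1:13)
  if (ok(a)) sols <- unique(rbind(sols, a))
}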


correlation for maximal coupling

January 2, 2018

An interesting (if vaguely formulated) question on X validated: given two Gaussian variates that are maximally coupled, what is the correlation between these variates? The answer depends on the parameters of both Gaussians, with a correlation of one when both Gaussians are identical. Answering the question by simulation (as I could not figure out the […]
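As an illustration of the simulation approach, here is a hedged sketch (the generic maximal-coupling sampler, not the answer's actual code) coupling N(0,1) and N(1,2²) and estimating the correlation empirically:

# maximal coupling of N(mu1, s1^2) and N(mu2, s2^2): with probability equal to the overlap
# of the two densities the variates coincide; otherwise the second variate is drawn from
# the residual part of its density
rmaxcouple <- function(mu1, s1, mu2, s2) {
  x <- rnorm(1, mu1, s1)
  if (runif(1, 0, dnorm(x, mu1, s1)) <= dnorm(x, mu2, s2)) return(c(x, x))
  repeat {
    y <- rnorm(1, mu2, s2)
    if (runif(1, 0, dnorm(y, mu2, s2)) > dnorm(y, mu1, s1)) return(c(x, y))
  }
}
sims <- t(replicate(1e4, rmaxcouple(0, 1, 1, 2)))
cor(sims[, 1], sims[, 2])   # empirical correlation between the maximally coupled variates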


random wake

December 26, 2017

Just too often on X validated, one sees questions displaying a complete ignorance of the basics, which makes one wonder, rather pointlessly, what the point is of trying to implement advanced methods while missing the necessary background. And just as often, I have reacted to such questions by wondering out loud about this… In the current case, […]


Le Monde puzzle [#1033]

December 18, 2017

A simple Le Monde mathematical puzzle after two geometric ones I did not consider: Bob gets a 2×3 card with three integer entries on the first row and two integer entries on the second row such that (i) entry (1,1) is 1, (ii) summing up subsets of adjacent entries produces all integers from 1 to […]


sliced Wasserstein estimation of mixtures

November 27, 2017

A paper by Soheil Kolouri and co-authors, arXived last week, is about using the Wasserstein distance for inference on multivariate Gaussian mixtures. The basic concept is that the parameter is estimated by minimising the p-Wasserstein distance to the empirical distribution, smoothed by a Normal kernel. As the general Wasserstein distance is quite costly to compute, the […]
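To convey the "sliced" idea (projecting both distributions onto random one-dimensional directions, where the Wasserstein distance reduces to quantile matching), here is a rough Monte Carlo sketch for two samples of equal size; this is my own illustration, not the authors' code, and the paper's Normal-kernel smoothing is omitted.

# Monte Carlo sliced p-Wasserstein distance between two d-dimensional samples x and y
sliced_wass <- function(x, y, p = 2, nproj = 100) {
  d <- ncol(x)
  val <- numeric(nproj)
  for (k in 1:nproj) {
    theta <- rnorm(d); theta <- theta / sqrt(sum(theta^2))  # random direction on the sphere
    px <- sort(x %*% theta); py <- sort(y %*% theta)        # one-dimensional projections
    val[k] <- mean(abs(px - py)^p)                          # 1-D p-Wasserstein^p by quantile matching
  }
  mean(val)^(1 / p)
}
x <- matrix(rnorm(2000), ncol = 2)             # sample from a standard bivariate Normal
y <- matrix(rnorm(2000, mean = 1), ncol = 2)   # shifted sample
sliced_wass(x, y)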


Le Monde puzzle [#1029]

November 21, 2017

A convoluted counting Le Monde mathematical puzzle: a film theatre has a waiting room and several projection rooms, with four films on display. A first set of 600 spectators enter the waiting room and vote for their favourite film. The most popular film is projected to the spectators who voted for it and the remaining […]


normal variates in Metropolis step

November 13, 2017

A definitely puzzled participant on X validated, confusing the Normal variate or variable used in the random walk Metropolis-Hastings step with its Normal density… It took repeated efforts to point out the distinction. Especially as the originator of the question had a rather strong a priori about his or her background: “I take issue […]
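To spell the distinction out, here is a generic random-walk Metropolis sketch (an illustration, not the thread's code): the Normal quantity in the step is a variate perturbing the current value, while the only density entering the acceptance probability is the target's, the symmetric Normal proposal density cancelling out of the ratio.

target <- function(x) exp(-abs(x)) / 2   # toy target: double-exponential density
mh_step <- function(curr, scale = 1) {
  prop <- curr + rnorm(1, 0, scale)      # Normal variate: the proposed move
  if (runif(1) < target(prop) / target(curr)) prop else curr   # only the target density here
}
chain <- numeric(1e4); chain[1] <- 0
for (t in 2:1e4) chain[t] <- mh_step(chain[t - 1])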


Le Monde [last] puzzle [#1026]

November 1, 2017

The last and final Le Monde puzzle is a bit of a disappointment, to wit: A 4×4 table is filled with distinct positive integers. A 3×3 table is then deduced by adding four adjacent [i.e. sharing a common corner] entries of the original table. Similarly with a 2×2 table, summing up to a unique […]
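For concreteness, a small sketch of the reduction step described above (my illustration of the stated rule, not the puzzle's solution): each entry of the reduced table sums the four entries of the larger table that share a common corner, i.e. a 2×2 block.

# reduce an n x n table to an (n-1) x (n-1) table by summing 2x2 blocks
reduce_table <- function(m) {
  n <- nrow(m) - 1
  out <- matrix(0, n, n)
  for (i in 1:n) for (j in 1:n) out[i, j] <- sum(m[i:(i + 1), j:(j + 1)])
  out
}
m4 <- matrix(sample(1:50, 16), 4, 4)   # a 4x4 table of distinct positive integers
m3 <- reduce_table(m4)                 # the 3x3 table
m2 <- reduce_table(m3)                 # the 2x2 table
m1 <- reduce_table(m2)                 # the final unique value (as a 1x1 matrix)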


computational methods for numerical analysis with R [book review]

October 30, 2017

This is a book by James P. Howard, II, which I received from CRC Press for review in CHANCE. (As usual, the customary warning applies: most of this blog post will appear later in my book review column in CHANCE.) It consists of a traditional introduction to numerical analysis, supported by R code and packages. […]


mea culpa!

October 8, 2017

An entry about our Bayesian Essentials book on X validated alerted me to a typo in the derivation of the Gaussian posterior! When deriving the posterior (which was left as an exercise in Bayesian Core), I simply forgot the term expressing the divergence between the prior mean and the sample mean. Mea culpa!
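For the record, in the generic conjugate Normal setting (a standard reminder, not necessarily the book's exact parameterisation), with x₁,…,xₙ iid N(μ,σ²), σ² known, and prior μ ~ N(μ₀,τ²), completing the square in μ gives

\exp\Big\{-\frac{n(\bar x-\mu)^2}{2\sigma^2}-\frac{(\mu-\mu_0)^2}{2\tau^2}\Big\}
  = \exp\Big\{-\frac{\sigma^2+n\tau^2}{2\sigma^2\tau^2}\Big(\mu-\frac{n\tau^2\bar x+\sigma^2\mu_0}{\sigma^2+n\tau^2}\Big)^2\Big\}
    \times \exp\Big\{-\frac{n(\bar x-\mu_0)^2}{2(\sigma^2+n\tau^2)}\Big\}

and the second factor is precisely a term measuring the divergence between the prior mean and the sample mean: constant in μ, hence harmless for the posterior on μ alone, but needed in the marginal likelihood and whenever other parameters remain unknown, and easy to drop by mistake.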


