Lyddon, S. P.; Holmes, C. C.; Walker, S. G.
General Bayesian updating and the loss-likelihood bootstrap. (English) Zbl 1454.62098
Biometrika 106, No. 2, 465-478 (2019).

The weighted likelihood bootstrap is a method for generating samples from an approximate Bayesian posterior of a parametric model. The paper under review shows that the same method can be derived, without approximation, under a Bayesian nonparametric model in which the parameter of interest is defined as the minimizer of the expected negative log-likelihood under an unknown sampling distribution. This allows the weighted likelihood bootstrap to be extended to posterior sampling for parameters that minimize an expected loss. The authors call this method the loss-likelihood bootstrap and connect it to general Bayesian updating, i.e., a way of updating prior belief distributions that does not require the construction of a global probability model, yet does require the calibration of two forms of loss function. The loss-likelihood bootstrap is then used to calibrate the general Bayesian posterior by matching asymptotic Fisher information. Finally, the proposed method is illustrated on a number of examples.

Reviewer: Joseph Melamed (Los Angeles)

Cited in 15 Documents

MSC:
62F15 Bayesian inference
62F40 Bootstrap, jackknife and other resampling methods
62B10 Statistical aspects of information-theoretic topics

Keywords: Bayesian bootstrap; Fisher information; general Bayesian updating; loss function; loss-likelihood bootstrap; model misspecification; weighted likelihood bootstrap
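To make the reviewed idea concrete, the following is a minimal sketch (not the authors' code) of a loss-likelihood bootstrap: each posterior draw minimizes a Dirichlet-weighted empirical loss over the data. The function name `ll_bootstrap`, the data-generating setup, and the squared-error loss are illustrative assumptions; the paper treats general loss functions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=200)  # illustrative observed data

def ll_bootstrap(data, loss, n_draws=500, rng=rng):
    """Hypothetical sketch of the loss-likelihood bootstrap:
    each draw minimizes sum_i w_i * loss(theta, x_i) with
    Dirichlet(1, ..., 1) random weights on the observations."""
    n = len(data)
    draws = np.empty(n_draws)
    for b in range(n_draws):
        w = rng.dirichlet(np.ones(n))  # random weights summing to 1
        res = minimize_scalar(lambda t: np.sum(w * loss(t, data)))
        draws[b] = res.x
    return draws

# Squared-error loss: each draw is a Dirichlet-weighted sample mean,
# so the draws concentrate around the sample mean of x.
sq_loss = lambda theta, data: (data - theta) ** 2
post = ll_bootstrap(x, sq_loss)
print(post.mean(), post.std())
```

With the negative log-likelihood as the loss, this reduces to the weighted likelihood bootstrap; the generality of `loss` is what the reviewed paper's extension buys.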