
Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-differentiable Priors

Journal contribution posted on 13.01.2022, 18:23 by Jacob Vorstrup Goldman, Torben Sell, Sumeetpal Sidhu Singh

The use of nondifferentiable priors in Bayesian statistics has become increasingly popular, particularly in Bayesian imaging analysis. Current state-of-the-art methods are approximate in the sense that they replace the posterior with a smooth approximation via Moreau-Yosida envelopes and apply gradient-based discretized diffusions to sample from the resulting distribution. We characterize the error of the Moreau-Yosida approximation and propose a novel implementation using underdamped Langevin dynamics. In mission-critical cases, however, replacing the posterior with an approximation may not be a viable option. Instead, we show that piecewise-deterministic Markov processes (PDMPs) can be used for exact posterior inference from distributions that are almost everywhere differentiable. Furthermore, in contrast with diffusion-based methods, the suggested PDMP-based samplers place no assumptions on the shape of the prior and do not require access to a computationally cheap proximal operator, and consequently have a much broader scope of application. Through detailed numerical examples, including a nondifferentiable circular distribution and a nonconvex genomics model, we elucidate the relative strengths of these sampling methods on problems of moderate to high dimension, underlining the benefits of PDMP-based methods when accurate sampling is decisive. Supplementary materials for this article are available online.
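As context for the smoothing approach the abstract describes, the following is a minimal sketch of a Moreau-Yosida-regularized unadjusted Langevin step (in the spirit of MYULA; it uses overdamped rather than the paper's underdamped dynamics, and it is not the authors' implementation). It assumes a toy target exp(-f(x) - g(x)) with a Gaussian likelihood term f and a nondifferentiable Laplace (l1) prior g, whose proximal operator is soft-thresholding; the step size delta, smoothing parameter gamma, and weight lam are illustrative choices.

    import numpy as np

    def myula_sample(y, sigma=1.0, lam=1.0, gamma=0.1, delta=0.01,
                     n_steps=10_000, rng=None):
        """Sample from exp(-f(x) - g(x)) with f(x) = ||x - y||^2 / (2 sigma^2)
        and g(x) = lam * ||x||_1, replacing g by its Moreau-Yosida envelope.

        The envelope's gradient is (x - prox_{gamma g}(x)) / gamma, and
        prox_{gamma g} is soft-thresholding at level gamma * lam.
        Sketch only; not the paper's implementation.
        """
        rng = np.random.default_rng() if rng is None else rng
        d = y.shape[0]
        x = np.zeros(d)
        samples = np.empty((n_steps, d))
        for k in range(n_steps):
            grad_f = (x - y) / sigma**2                        # gradient of the smooth part
            prox = np.sign(x) * np.maximum(np.abs(x) - gamma * lam, 0.0)
            grad_g_smooth = (x - prox) / gamma                 # gradient of the MY envelope
            x = x - delta * (grad_f + grad_g_smooth) \
                + np.sqrt(2.0 * delta) * rng.standard_normal(d)
            samples[k] = x
        return samples

    samples = myula_sample(y=np.array([1.0, -2.0, 0.5]))
    print(samples[5000:].mean(axis=0))  # posterior mean estimate, biased by the smoothing

Because the chain targets the smoothed surrogate rather than the posterior itself, its output carries an approximation bias controlled by gamma; this is the kind of error the paper characterizes.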
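To illustrate the PDMP alternative, here is a toy zig-zag sampler for the product Laplace target exp(-||x||_1), which is differentiable everywhere except at the origin, so only almost-everywhere gradients are needed and no proximal operator appears. This is an illustrative sketch under those assumptions, not one of the samplers developed in the paper; for this particular target the event times invert exactly, so no thinning is required.

    import numpy as np

    def zigzag_laplace(d=3, T=1_000.0, rng=None):
        """Zig-zag sampler for the product Laplace target exp(-sum_i |x_i|).

        Coordinate i moves with velocity theta_i in {-1, +1}; its flip rate is
        max(0, theta_i * sign(x_i + theta_i * t)), i.e. rate 1 while moving away
        from the origin and rate 0 while moving towards it. Integrating this
        rate gives the exact first-event time tau_i = max(-theta_i * x_i, 0) + Exp(1).
        Returns the piecewise-linear skeleton (event times, positions).
        """
        rng = np.random.default_rng() if rng is None else rng
        x = np.zeros(d)
        theta = rng.choice([-1.0, 1.0], size=d)
        t, times, skeleton = 0.0, [0.0], [x.copy()]
        while t < T:
            # Exact first-event time per coordinate; the process jumps at the earliest.
            tau = np.maximum(-theta * x, 0.0) + rng.exponential(size=d)
            i = int(np.argmin(tau))
            x = x + theta * tau[i]      # deterministic linear drift until the event
            theta[i] = -theta[i]        # flip the velocity of the event coordinate
            t += tau[i]
            times.append(t)
            skeleton.append(x.copy())
        return np.array(times), np.array(skeleton)

    times, skel = zigzag_laplace()
    # Time averages along the piecewise-linear trajectory (not plain averages of
    # the skeleton points) approximate expectations under the target.

Unlike the smoothed Langevin chain above, the continuous-time trajectory leaves the nondifferentiable target invariant exactly, which is the sense in which PDMP-based samplers give exact posterior inference.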

Funding

JVG acknowledges financial support from an EPSRC Doctoral Training Award. TS acknowledges financial support from the Cantab Capital Institute for the Mathematics of Information.
