
Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps

Simon Apers, Sander Gribling, Dániel Szilágyi.

Year: 2024, Volume: 25, Issue: 348, Pages: 1−30


Abstract

Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density e^{−f(x)}, given access to the gradient of f. A particular case of interest is that of a d-dimensional Gaussian distribution with covariance matrix Σ, in which case f(x) = x^⊤ Σ^{−1} x. We show that Metropolis-adjusted HMC can sample from a distribution that is ε-close to a Gaussian in total variation distance using Õ(√κ d^{1/4} log(1/ε)) gradient queries, where ε > 0 and κ is the condition number of Σ. Our algorithm uses long and random integration times for the Hamiltonian dynamics, and it creates a warm start by first running HMC without a Metropolis adjustment. This contrasts with (and was motivated by) recent results that give an Ω̃(√κ d^{1/2}) query lower bound for HMC with a fixed integration time or from a cold start, even for the Gaussian case.
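To make the setup concrete, here is a minimal sketch of Metropolis-adjusted HMC targeting a Gaussian density ∝ e^{−f(x)} with f(x) = x^⊤ Σ^{−1} x, using a leapfrog integrator and a random number of leapfrog steps per iteration (echoing the paper's random integration times). This is an illustrative implementation under our own parameter choices, not the authors' algorithm: the step size, the uniform distribution over step counts, and the function names are assumptions, and the unadjusted warm-start phase from the abstract is omitted.

```python
import numpy as np

def hmc_gaussian(Sigma_inv, x0, n_samples, step_size=0.05, max_steps=50, rng=None):
    """Metropolis-adjusted HMC targeting density proportional to exp(-x^T Sigma_inv x).

    Illustrative sketch only: each iteration integrates the Hamiltonian
    dynamics for a random number of leapfrog steps (uniform in [1, max_steps]),
    then applies a Metropolis accept/reject correction.
    """
    rng = np.random.default_rng() if rng is None else rng
    f = lambda x: x @ Sigma_inv @ x               # potential f(x) = x^T Sigma^{-1} x
    grad = lambda x: 2.0 * Sigma_inv @ x          # its gradient
    x = np.asarray(x0, dtype=float)
    samples, accepts = [], 0
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)          # resample Gaussian momentum
        x_new, p_new = x.copy(), p.copy()
        L = int(rng.integers(1, max_steps + 1))   # random integration time
        # leapfrog integration of Hamiltonian dynamics
        p_new -= 0.5 * step_size * grad(x_new)
        for _ in range(L - 1):
            x_new += step_size * p_new
            p_new -= step_size * grad(x_new)
        x_new += step_size * p_new
        p_new -= 0.5 * step_size * grad(x_new)
        # Metropolis adjustment: accept with probability min(1, e^{-dH})
        dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
        if np.log(rng.random()) < -dH:
            x = x_new
            accepts += 1
        samples.append(x.copy())
    return np.array(samples), accepts / n_samples
```

Because the leapfrog integrator nearly conserves the Hamiltonian for a well-chosen step size, the Metropolis correction rejects only rarely; the random integration time avoids the resonance pathologies that underlie the fixed-integration-time lower bound discussed in the abstract.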
