Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps
Simon Apers, Sander Gribling, Dániel Szilágyi; 25(348):1−30, 2024.
Abstract
Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density e^{−f(x)}, given access to the gradient of f. A particular case of interest is that of a d-dimensional Gaussian distribution with covariance matrix Σ, in which case f(x) = ½ x^⊤ Σ^{−1} x. We show that Metropolis-adjusted HMC can sample from a distribution that is ε-close to a Gaussian in total variation distance using Õ(√κ · d^{1/4} · log(1/ε)) gradient queries, where ε > 0 and κ is the condition number of Σ. Our algorithm uses long and random integration times for the Hamiltonian dynamics, and it creates a warm start by first running HMC without a Metropolis adjustment. This contrasts with (and was motivated by) recent results that give an Ω̃(κ d^{1/2}) query lower bound for HMC with a fixed integration time or from a cold start, even for the Gaussian case.
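As an illustration of the kind of algorithm the abstract describes, the following is a minimal sketch of Metropolis-adjusted HMC for a Gaussian target f(x) = ½ x^⊤ Σ^{−1} x, using a leapfrog integrator and an integration time drawn uniformly at random each iteration. The uniform draw, the step size, and the function names are illustrative assumptions; the paper's actual schedule of long, randomized integration times and its unadjusted warm start are more refined.

```python
import numpy as np

def leapfrog(x, p, grad_f, step, n_steps):
    # Leapfrog integrator for Hamiltonian dynamics with H(x, p) = f(x) + |p|^2 / 2.
    p = p - 0.5 * step * grad_f(x)
    for _ in range(n_steps - 1):
        x = x + step * p
        p = p - step * grad_f(x)
    x = x + step * p
    p = p - 0.5 * step * grad_f(x)
    return x, p

def hmc_gaussian(Sigma_inv, n_samples, step=0.1, max_steps=50, seed=None):
    """Metropolis-adjusted HMC targeting exp(-x^T Sigma_inv x / 2).

    Each iteration resamples the momentum, draws a random number of leapfrog
    steps (a crude stand-in for a randomized integration time), and applies a
    Metropolis accept/reject correction on the Hamiltonian.
    """
    rng = np.random.default_rng(seed)
    d = Sigma_inv.shape[0]
    f = lambda x: 0.5 * x @ Sigma_inv @ x
    grad_f = lambda x: Sigma_inv @ x
    x = np.zeros(d)  # cold start, for simplicity of the sketch
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(d)                 # fresh Gaussian momentum
        n_steps = int(rng.integers(1, max_steps + 1))  # random integration time
        x_new, p_new = leapfrog(x, p, grad_f, step, n_steps)
        # Metropolis correction: accept with probability min(1, exp(-dH)).
        dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < -dH:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)
```

For example, running the sketch on a 2-dimensional Gaussian with Σ^{−1} = diag(1, 4) produces samples whose empirical variances are close to the target variances (1, 0.25).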
© JMLR 2024.