Optimal dimension dependence of the Metropolis-Adjusted Langevin Algorithm
Sinho Chewi, Chen Lu, Kwangjun Ahn, Xiang Cheng, Thibaut Le Gouic, Philippe Rigollet
Session: Sampling Algorithms
Session Chair: Jonathan Niles-Weed
Poster: Poster Session 4
Abstract:
Conventional wisdom in the sampling literature, backed by a popular diffusion scaling limit, suggests that the mixing time of the Metropolis-Adjusted Langevin Algorithm (MALA) scales as O(d^{1/3}), where d is the dimension. However, the diffusion scaling limit requires stringent assumptions on the target distribution and is asymptotic in nature. In contrast, the best known non-asymptotic mixing time bound for MALA on the class of log-smooth and strongly log-concave distributions is O(d). In this work, we establish that the mixing time of MALA on this class of target distributions is \tilde\Theta(d^{1/2}) under a warm start.
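For context, here is a minimal sketch of one MALA iteration (the standard algorithm, not anything specific to this paper): the proposal is an Euler-Maruyama discretization of the Langevin SDE for a target \pi \propto e^{-V} with step size h, followed by a Metropolis-Hastings accept/reject. The function names, the step size 0.1/\sqrt{d}, and the iteration count below are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def mala_step(x, grad_V, log_pi, h, rng):
    """One MALA step: Langevin proposal followed by Metropolis-Hastings accept/reject."""
    # Euler-Maruyama discretization of the Langevin SDE as the proposal
    y = x - h * grad_V(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)

    # log q(a -> b) for the Gaussian proposal N(a - h grad_V(a), 2h I); constants cancel in the ratio
    def log_q(a, b):
        return -np.sum((b - a + h * grad_V(a)) ** 2) / (4.0 * h)

    # Metropolis-Hastings log acceptance ratio
    log_alpha = log_pi(y) + log_q(y, x) - log_pi(x) - log_q(x, y)
    return y if np.log(rng.uniform()) < log_alpha else x

# Illustrative usage on a standard Gaussian target pi propto exp(-|x|^2 / 2)
rng = np.random.default_rng(0)
d = 100
x = rng.standard_normal(d)  # in practice, drawn from a warm start close to pi
for _ in range(1000):
    x = mala_step(x, grad_V=lambda z: z,
                  log_pi=lambda z: -0.5 * np.sum(z ** 2),
                  h=0.1 / np.sqrt(d), rng=rng)
```

The abstract's \tilde\Theta(d^{1/2}) bound concerns how many such steps are needed, from a warm start, for the law of x to approach a log-smooth and strongly log-concave target; the step-size schedule used in the proof is not reproduced here.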
Our upper-bound proof introduces a new technique based on a projection characterization of the Metropolis adjustment, which reduces the study of MALA to the well-studied discretization analysis of the Langevin SDE and bypasses direct computation of the acceptance probability.
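For reference, the quantity whose direct computation is bypassed is the standard Metropolis-Hastings acceptance probability (this is textbook notation, not the paper's projection characterization):

\alpha(x, y) = \min\left\{1, \frac{\pi(y)\, q(y, x)}{\pi(x)\, q(x, y)}\right\}, \qquad q(x, \cdot) = \mathcal{N}\!\left(x - h \nabla V(x),\, 2h I_d\right),

where \pi \propto e^{-V} is the target and h is the step size. Prior non-asymptotic analyses bound the mixing time by expanding \alpha directly; the projection viewpoint instead leverages existing discretization bounds for the Langevin SDE.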