From Nesterov's Estimate Sequence to Riemannian Acceleration
Kwangjun Ahn, Suvrit Sra
Subject areas: Non-convex optimization, Approximation algorithms, Convex optimization
Presented in: Session 2A, Session 2C
Abstract:
We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov's estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into "metric distortion." We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
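For context, the estimate sequence the abstract refers to is, in its standard Euclidean form (Nesterov's textbook definition, not the simplified construction or the Riemannian extension developed in this paper), a pair of sequences $\{\phi_k(x)\}_{k\ge 0}$ and $\{\lambda_k\}_{k\ge 0}$ with $\lambda_k \ge 0$ and $\lambda_k \to 0$ such that, for all $x$ and all $k \ge 0$,
\[
  \phi_k(x) \;\le\; (1-\lambda_k)\, f(x) \;+\; \lambda_k\, \phi_0(x).
\]
If the iterates additionally satisfy $f(x_k) \le \min_x \phi_k(x)$, then
\[
  f(x_k) - f^\star \;\le\; \lambda_k \bigl(\phi_0(x^\star) - f^\star\bigr) \;\to\; 0,
\]
so the convergence rate of the method is governed by how quickly $\lambda_k$ decreases.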