The Gradient Complexity of Linear Regression
Mark Braverman, Elad Hazan, Max Simchowitz, Blake E Woodworth
Subject areas: Randomized linear algebraic methods and sketching, Convex optimization
Presented in: Session 3B, Session 3D
Abstract:
We investigate the computational complexity of several basic linear algebra primitives, including top eigenvector computation and linear regression, in the computational model that allows access to the data only via a matrix-vector product oracle. We show that for polynomial accuracy, $\Theta(d)$ calls to the oracle are necessary and sufficient even for a randomized algorithm, where $d$ is the problem dimension.

Our lower bound is based on a reduction to estimating the least eigenvalue of a random Wishart matrix. This simple distribution enables a concise proof, leveraging a few key properties of the random Wishart ensemble.
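To make the access model concrete, here is a minimal sketch (not from the paper) of the matrix-vector product oracle: the solver touches the matrix only through a query-counting `matvec` call, and conjugate gradients is run on a Wishart-type instance of the kind used in the lower-bound construction. The class and function names, and the Wishart normalization, are illustrative assumptions, not the paper's notation.

```python
import numpy as np

class MatVecOracle:
    """Access to a PSD matrix A only through matrix-vector products, counting queries."""
    def __init__(self, A):
        self._A = A
        self.calls = 0

    def matvec(self, v):
        self.calls += 1
        return self._A @ v

def conjugate_gradient(oracle, b, tol=1e-8, max_iters=None):
    """Solve A x = b using only oracle.matvec queries (standard CG, x0 = 0)."""
    d = b.shape[0]
    max_iters = max_iters or d
    x = np.zeros(d)
    r = b.copy()          # residual for x0 = 0
    p = r.copy()
    rs = r @ r
    for _ in range(max_iters):
        Ap = oracle.matvec(p)          # the only way the matrix is accessed
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Illustrative instance: A = G G^T / d with G a d x d Gaussian matrix (Wishart ensemble).
# Its least eigenvalue is tiny, so the system is ill-conditioned and CG needs on the
# order of d queries for high accuracy, matching the Theta(d) behavior in the abstract.
rng = np.random.default_rng(0)
d = 200
G = rng.standard_normal((d, d))
A = G @ G.T / d
b = rng.standard_normal(d)

oracle = MatVecOracle(A)
x = conjugate_gradient(oracle, b)
print("oracle calls:", oracle.calls, " residual norm:", np.linalg.norm(A @ x - b))
```

The oracle wrapper makes the query count explicit, which is the quantity the $\Theta(d)$ bound measures; everything else (the choice of CG, the dimension, the seed) is an arbitrary illustration.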