Stochastic optimization with compressed gradients
# 19
Abstract
We consider stochastic optimization over $\ell_p$ spaces using access to a first-order oracle. In the first part of the talk, we ask: what is the minimum precision required for oracle outputs to retain the unrestricted convergence rates? We characterize this precision for every $p\geq 1$ by deriving information-theoretic lower bounds and by providing quantizers that (almost) achieve these lower bounds. In the second part of the talk, we completely characterize the precision-convergence trade-off for the Euclidean case. Interestingly, the quantizer designed for this setting, RATQ, (almost) achieves the rate-distortion bounds universally for the well-studied Gaussian rate-distortion problem. This talk is based on joint work with Himanshu Tyagi.
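To give a concrete feel for gradient quantization in this setting, the sketch below implements a simple unbiased stochastic uniform quantizer for a gradient vector. This is only an illustrative baseline under assumed design choices (per-coordinate uniform grid on $[-M, M]$ with randomized rounding); it is not the RATQ quantizer from the talk, whose construction (randomized rotation followed by adaptive tetra-iterated quantization) is more involved.

```python
import numpy as np

def stochastic_uniform_quantize(g, num_levels, rng):
    """Unbiased per-coordinate stochastic quantizer (illustrative baseline).

    Maps each coordinate of g to one of `num_levels` evenly spaced points
    on [-M, M], where M = max |g_i|, rounding up or down at random with
    probabilities chosen so that E[Q(g)] = g.
    """
    M = np.max(np.abs(g))
    if M == 0:
        return np.zeros_like(g)
    step = 2 * M / (num_levels - 1)
    scaled = (g + M) / step            # coordinate position in grid units
    lower = np.floor(scaled)
    prob_up = scaled - lower           # probability of rounding up
    rounded = lower + (rng.random(g.shape) < prob_up)
    return rounded * step - M

rng = np.random.default_rng(0)
g = rng.standard_normal(8)
q = stochastic_uniform_quantize(g, num_levels=16, rng=rng)

# Averaging many independent quantizations recovers g (unbiasedness),
# so plugging Q(g) into SGD preserves convergence in expectation.
est = np.mean(
    [stochastic_uniform_quantize(g, 16, np.random.default_rng(s))
     for s in range(2000)],
    axis=0,
)
```

With `num_levels` levels per coordinate, each coordinate is described with $\log_2(\text{num\_levels})$ bits; the talk's question is how small this precision can be made without degrading the optimization rate.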