Article

Dual Space Preconditioning for Gradient Descent

(2019). arXiv:1902.02257.

Comment: Major revision, including simpler equivalent conditions for dual relative smoothness and applications to exponential penalty functions and p-norm regression.

Abstract

The conditions of relative smoothness and relative strong convexity were recently introduced for the analysis of Bregman gradient methods for convex optimization. We introduce a fully explicit descent scheme with relative smoothness in the dual space between the convex conjugate of the objective function and a designed dual reference function. For Legendre type convex functions under this dual relative smoothness, our scheme naturally remains in the interior of the domain, despite being fully explicit. We obtain linear convergence under dual relative strong convexity with a condition number that is invariant under horizontal translations. Our method is a non-linear preconditioning of gradient descent that can improve the conditioning of explicit first-order methods on problems with non-smooth or non-strongly convex structure. We show how this method can be applied to p-norm regression and exponential penalty function minimization.
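The nonlinear preconditioning described above can be sketched as an update of the form x⁺ = x − σ ∇h*(∇f(x)), where h* is the convex conjugate of a dual reference function h. The following is a minimal illustrative sketch on p-norm regression, not the authors' implementation; the function names, the choice h(y) = (1/p)‖y‖ₚᵖ applied componentwise, and the step size σ are assumptions made for this toy example.

```python
import numpy as np

# Toy p-norm regression: f(x) = (1/p) * ||A x - b||_p^p  (illustrative setup)
p = 4.0
q = p / (p - 1.0)  # conjugate exponent, 1/p + 1/q = 1

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)

def f(x):
    return np.sum(np.abs(A @ x - b) ** p) / p

def grad_f(x):
    r = A @ x - b
    return A.T @ (np.abs(r) ** (p - 1) * np.sign(r))

def grad_hstar(g):
    # Gradient of the conjugate of the (assumed) componentwise reference
    # h(y) = (1/p)|y|^p, namely (1/q)|g|^q, so grad_hstar(g) = |g|^(q-1) sign(g).
    return np.abs(g) ** (q - 1) * np.sign(g)

# Dual-space-preconditioned gradient iteration: x <- x - sigma * grad_hstar(grad_f(x)).
# Since <grad_f(x), grad_hstar(grad_f(x))> = sum_i |g_i|^q >= 0, this is a
# descent direction for small enough sigma (chosen here by hand).
x = np.zeros(5)
sigma = 1e-2
f0 = f(x)
for _ in range(2000):
    x = x - sigma * grad_hstar(grad_f(x))
f1 = f(x)
```

Note that applying ∇h* to the raw gradient rescales it nonlinearly (here by a power 1/(p−1)), which is what tames the poor conditioning of plain gradient descent on non-smooth powers such as |r|ᵖ.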
