Abstract
The conditions of relative smoothness and relative strong convexity were
recently introduced for the analysis of Bregman gradient methods for convex
optimization. We introduce a fully explicit descent scheme with relative
smoothness in the dual space between the convex conjugate of the objective
function and a designed dual reference function. For Legendre type convex
functions under this dual relative smoothness, our scheme naturally remains in
the interior of the domain, despite being fully explicit. We obtain linear
convergence under dual relative strong convexity with a condition number that
is invariant under horizontal translations. Our method is a non-linear
preconditioning of gradient descent that can improve the conditioning of
explicit first-order methods on problems with non-smooth or non-strongly convex
structure. We show how this method can be applied to p-norm regression and
exponential penalty function minimization.
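To make the idea of nonlinear preconditioning concrete, here is a minimal sketch (not the paper's exact scheme) on a separable p-norm problem: the gradient is passed through a nonlinear map built from the dual (Hölder-conjugate) exponent before taking an explicit step. All function names, the choice of objective, and the step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch: nonlinearly preconditioned gradient descent on the
# separable p-norm objective f(x) = (1/p) * sum_i |x_i - b_i|^p.
# The preconditioner maps a gradient g to sign(g) * |g|**(q - 1), where
# q = p / (p - 1) is the dual exponent. This setup is a hypothetical
# example, not the scheme defined in the paper.

def grad_f(x, b, p):
    # Gradient of f(x) = (1/p) * sum |x_i - b_i|^p.
    r = x - b
    return np.sign(r) * np.abs(r) ** (p - 1)

def precondition(g, p):
    # Nonlinear map using the Holder-conjugate exponent q = p / (p - 1).
    # For this separable objective it recovers x - b exactly from the gradient.
    q = p / (p - 1)
    return np.sign(g) * np.abs(g) ** (q - 1)

def preconditioned_gd(x0, b, p, step=0.5, iters=200):
    # Fully explicit iteration: x <- x - step * precondition(grad_f(x)).
    x = x0.copy()
    for _ in range(iters):
        x = x - step * precondition(grad_f(x, b, p), p)
    return x

b = np.array([1.0, -2.0, 0.5])
x = preconditioned_gd(np.zeros(3), b, p=4)
```

On this toy problem the preconditioned residual equals x - b, so the iteration contracts linearly at rate 1 - step regardless of how flat the p-norm objective is near its minimizer, whereas plain gradient descent slows down there because the raw gradient vanishes at order p - 1.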