The global rate of convergence for optimal tensor methods in smooth
convex optimization
A. Gasnikov, P. Dvurechensky, E. Gorbunov, E. Vorontsova, D. Selikhanovych, and C. Uribe. (2018). cite arxiv:1809.00382. Comment: In the current version we present a translation into English of the main derivations, which first appeared on September 2, 2018 in Russian, extend the analysis from the case of a strongly convex objective to the case of uniformly convex objectives, and add a numerical analysis of our results.
Abstract
We consider convex optimization problems with the objective function having
Lipschitz-continuous $p$-th order derivative, where $p \geq 1$. We propose a new
tensor method, which closes the gap between the lower
$O\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ and upper
$O\left(\varepsilon^{-\frac{1}{p+1}}\right)$ iteration complexity bounds for
this class of optimization problems.
We also consider uniformly convex functions, and show how the proposed method
can be accelerated under this additional assumption. Moreover, we introduce a
$p$-th order condition number which naturally arises in the complexity analysis
of tensor methods under this assumption.
Finally, we make a numerical study of the proposed optimal method and show
that in practice it is faster than the best known accelerated tensor method. We
also compare the performance of tensor methods for $p=2$ and $p=3$ and show
that the 3rd-order method is superior to the 2nd-order method in practice.
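As a quick numerical illustration of the gap described in the abstract (not code from the paper), the snippet below evaluates the two iteration-complexity exponents, $\varepsilon^{-2/(3p+1)}$ for the optimal method and $\varepsilon^{-1/(p+1)}$ for the earlier upper bound, for a few orders $p$; constants and log factors are ignored.

```python
# Compare the iteration-complexity bounds quoted in the abstract:
#   optimal:  ~ eps^(-2/(3p+1))   (matches the lower bound)
#   previous: ~ eps^(-1/(p+1))    (earlier upper bound)
# This is only an order-of-magnitude sketch; constants are dropped.

def optimal_bound(eps: float, p: int) -> float:
    """Iterations of the optimal tensor method, up to constants."""
    return eps ** (-2.0 / (3 * p + 1))

def previous_bound(eps: float, p: int) -> float:
    """Iterations under the earlier upper bound, up to constants."""
    return eps ** (-1.0 / (p + 1))

if __name__ == "__main__":
    eps = 1e-6
    for p in (1, 2, 3):
        print(f"p={p}: optimal ~{optimal_bound(eps, p):.1f}, "
              f"previous ~{previous_bound(eps, p):.1f}")
```

For $p = 1$ the two exponents coincide ($-1/2$), while for $p \geq 2$ the optimal rate needs strictly fewer iterations, which is the gap the paper closes.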
Description
[1809.00382] The global rate of convergence for optimal tensor methods in smooth convex optimization
%0 Journal Article
%1 gasnikov2018global
%A Gasnikov, Alexander
%A Dvurechensky, Pavel
%A Gorbunov, Eduard
%A Vorontsova, Evgeniya
%A Selikhanovych, Daniil
%A Uribe, César A.
%D 2018
%K convergence convex learning
%T The global rate of convergence for optimal tensor methods in smooth
convex optimization
%U http://arxiv.org/abs/1809.00382
%X We consider convex optimization problems with the objective function having
Lipschitz-continuous $p$-th order derivative, where $p \geq 1$. We propose a new
tensor method, which closes the gap between the lower
$O\left(\varepsilon^{-\frac{2}{3p+1}}\right)$ and upper
$O\left(\varepsilon^{-\frac{1}{p+1}}\right)$ iteration complexity bounds for
this class of optimization problems.
We also consider uniformly convex functions, and show how the proposed method
can be accelerated under this additional assumption. Moreover, we introduce a
$p$-th order condition number which naturally arises in the complexity analysis
of tensor methods under this assumption.
Finally, we make a numerical study of the proposed optimal method and show
that in practice it is faster than the best known accelerated tensor method. We
also compare the performance of tensor methods for $p=2$ and $p=3$ and show
that the 3rd-order method is superior to the 2nd-order method in practice.
@article{gasnikov2018global,
abstract = {We consider convex optimization problems with the objective function having
Lipschitz-continuous $p$-th order derivative, where $p \geq 1$. We propose a new
tensor method, which closes the gap between the lower
$O\left(\varepsilon^{-\frac{2}{3p+1}} \right)$ and upper
$O\left(\varepsilon^{-\frac{1}{p+1}} \right)$ iteration complexity bounds for
this class of optimization problems.
We also consider uniformly convex functions, and show how the proposed method
can be accelerated under this additional assumption. Moreover, we introduce a
$p$-th order condition number which naturally arises in the complexity analysis
of tensor methods under this assumption.
Finally, we make a numerical study of the proposed optimal method and show
that in practice it is faster than the best known accelerated tensor method. We
also compare the performance of tensor methods for $p=2$ and $p=3$ and show
that the 3rd-order method is superior to the 2nd-order method in practice.},
added-at = {2019-02-28T21:09:43.000+0100},
author = {Gasnikov, Alexander and Dvurechensky, Pavel and Gorbunov, Eduard and Vorontsova, Evgeniya and Selikhanovych, Daniil and Uribe, César A.},
biburl = {https://www.bibsonomy.org/bibtex/26188e056a94cd69f37af2be0e59bed6f/kirk86},
description = {[1809.00382] The global rate of convergence for optimal tensor methods in smooth convex optimization},
interhash = {93197cdc6dc59c51e4e0befc396551ef},
intrahash = {6188e056a94cd69f37af2be0e59bed6f},
keywords = {convergence convex learning},
note = {cite arxiv:1809.00382. Comment: In the current version we present a translation into English of the main derivations, which first appeared on September 2, 2018 in Russian, extend the analysis from the case of a strongly convex objective to the case of uniformly convex objectives, and add a numerical analysis of our results},
timestamp = {2019-02-28T21:09:43.000+0100},
title = {The global rate of convergence for optimal tensor methods in smooth
convex optimization},
url = {http://arxiv.org/abs/1809.00382},
year = 2018
}