Iterate averaging as regularization for stochastic gradient descent

Gergely Neu and Lorenzo Rosasco (2018). arXiv:1802.08009.

Abstract

We propose and analyze a variant of the classic Polyak-Ruppert averaging scheme, broadly used in stochastic gradient methods. Rather than a uniform average of the iterates, we consider a weighted average, with weights decaying in a geometric fashion. In the context of linear least squares regression, we show that this averaging scheme has the same regularizing effect as, and is indeed asymptotically equivalent to, ridge regression. In particular, we derive finite-sample bounds for the proposed approach that match the best known results for regularized stochastic gradient methods.
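To make the averaging scheme concrete, below is a minimal sketch (not the authors' code) of SGD on linear least squares with a geometrically weighted iterate average, implemented as an exponential moving average of the iterates. The step size, the decay parameter `beta`, and the ridge parameter `lam` used for comparison are illustrative assumptions; the paper derives the precise correspondence between the decay rate and the ridge regularization.

```python
# Sketch: geometrically weighted iterate averaging for SGD on linear least
# squares, compared against a closed-form ridge regression solution.
# Hyperparameters (lr, beta, lam) are illustrative, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: y = X w* + noise
n, d = 2000, 20
X = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
y = X @ w_star + 0.5 * rng.normal(size=n)

def sgd_geometric_average(X, y, lr=0.01, beta=0.99, epochs=5):
    """Run plain SGD and maintain a geometrically weighted average of the
    iterates: w_bar_t = (1 - beta) * sum_{k<=t} beta^(t-k) w_k (an EMA)."""
    n, d = X.shape
    w = np.zeros(d)
    w_bar = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]        # stochastic gradient of 0.5*(x_i^T w - y_i)^2
            w = w - lr * grad                      # SGD step
            w_bar = beta * w_bar + (1 - beta) * w  # geometrically decaying average of iterates
    return w, w_bar

def ridge(X, y, lam):
    """Closed-form ridge regression estimator."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)

w_last, w_avg = sgd_geometric_average(X, y)
w_ridge = ridge(X, y, lam=0.1)  # lam chosen arbitrarily for illustration

print("||w_avg  - w_ridge|| =", np.linalg.norm(w_avg - w_ridge))
print("||w_last - w_ridge|| =", np.linalg.norm(w_last - w_ridge))
```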
