On the distance between two neural networks and the stability of learning

Jeremy Bernstein, Arash Vahdat, Yisong Yue, and Ming-Yu Liu. (2020). arXiv:2002.03432.

Abstract

This paper relates parameter distance to gradient breakdown for a broad class of nonlinear compositional functions. The analysis leads to a new distance function called deep relative trust and a descent lemma for neural networks. Since the resulting learning rule seems not to require learning rate grid search, it may unlock a simpler workflow for training deeper and more complex neural networks. Please find the Python code used in this paper here: https://github.com/jxbz/fromage.
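The authors' reference implementation is in the linked repository. As a rough, unofficial illustration of the kind of layer-wise update the abstract alludes to, the sketch below implements one Fromage-style step in PyTorch, assuming the per-tensor rule w ← (w − η·(‖w‖/‖g‖)·g) / √(1 + η²); the function name fromage_step is hypothetical, and this is a sketch rather than the official optimizer.

    import math
    import torch

    def fromage_step(params, lr=0.01):
        # Hedged sketch of a Fromage-like update (not the official optimizer):
        #   w <- (w - lr * (||w|| / ||g||) * g) / sqrt(1 + lr^2)
        # The norm ratio matches the gradient scale to the weight scale, and
        # dividing by sqrt(1 + lr^2) keeps the weight norm from growing.
        with torch.no_grad():
            for w in params:
                if w.grad is None:
                    continue
                g = w.grad
                w_norm, g_norm = w.norm(), g.norm()
                if w_norm > 0 and g_norm > 0:
                    w.add_(g, alpha=-lr * (w_norm / g_norm).item())
                else:
                    w.add_(g, alpha=-lr)  # plain SGD step when a norm is zero
                w.div_(math.sqrt(1.0 + lr ** 2))

Calling fromage_step(model.parameters(), lr=0.01) after loss.backward() would stand in for optimizer.step(). Because each gradient is rescaled by the weight-to-gradient norm ratio, a single default step size tends to behave sensibly across layers, which is the sense in which the learning rule may avoid learning rate grid search.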
