Putting An End to End-to-End: Gradient-Isolated Learning of
Representations
S. Löwe, P. O'Connor, and B. Veeling. (2019). arXiv:1905.11786. Honorable Mention for Outstanding New Directions Paper Award at NeurIPS 2019.
Abstract
We propose a novel deep learning method for local self-supervised
representation learning that does not require labels nor end-to-end
backpropagation but exploits the natural order in data instead. Inspired by the
observation that biological neural networks appear to learn without
backpropagating a global error signal, we split a deep neural network into a
stack of gradient-isolated modules. Each module is trained to maximally
preserve the information of its inputs using the InfoNCE bound from Oord et al.
2018. Despite this greedy training, we demonstrate that each module improves
upon the output of its predecessor, and that the representations created by the
top module yield highly competitive results on downstream classification tasks
in the audio and visual domain. The proposal enables optimizing modules
asynchronously, allowing large-scale distributed training of very deep neural
networks on unlabelled datasets.
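The core idea in the abstract (a stack of gradient-isolated modules, each greedily trained with an InfoNCE-style contrastive loss over temporally ordered data) can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's implementation: the module sizes, the `tanh` encoder, and the use of a plain dot-product critic over a toy sequence are all illustrative choices; in an autograd framework the key step would be detaching each module's input so no gradient flows to earlier modules.

```python
import numpy as np

rng = np.random.default_rng(0)

def module_forward(x, W):
    # one toy encoder module: a linear map with a tanh nonlinearity
    return np.tanh(x @ W)

def infonce_loss(z, k=1):
    """InfoNCE over a sequence z of shape (T, d): the positive for step t is
    the representation k steps ahead; the other timesteps act as negatives."""
    scores = z @ z.T          # (T, T) dot-product critic scores
    T = z.shape[0]
    losses = []
    for t in range(T - k):
        logits = scores[t]    # scores of z_t against every z_j in the sequence
        log_prob = logits[t + k] - np.log(np.exp(logits).sum())
        losses.append(-log_prob)
    return float(np.mean(losses))

# a toy "audio" sequence: 16 timesteps of 8-dimensional features
x = rng.normal(size=(16, 8))

# stack of gradient-isolated modules: each consumes the previous module's
# output as a constant input (in PyTorch terms, h = h.detach() before use),
# so each module could be optimized asynchronously with its own InfoNCE loss
weights = [rng.normal(scale=0.5, size=(8, 8)) for _ in range(3)]
h = x
for i, W in enumerate(weights):
    h = module_forward(h, W)
    loss = infonce_loss(h)
    print(f"module {i}: InfoNCE loss = {loss:.3f}")
```

Because gradients never cross module boundaries, each module's loss depends only on its own weights, which is what permits the asynchronous, distributed training the abstract describes.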
Description
Putting An End to End-to-End: Gradient-Isolated Learning of Representations
%0 Generic
%1 lowe2019putting
%A Löwe, Sindy
%A O'Connor, Peter
%A Veeling, Bastiaan S.
%D 2019
%K deeplearning optimization training
%T Putting An End to End-to-End: Gradient-Isolated Learning of
Representations
%U http://arxiv.org/abs/1905.11786
%X We propose a novel deep learning method for local self-supervised
representation learning that does not require labels nor end-to-end
backpropagation but exploits the natural order in data instead. Inspired by the
observation that biological neural networks appear to learn without
backpropagating a global error signal, we split a deep neural network into a
stack of gradient-isolated modules. Each module is trained to maximally
preserve the information of its inputs using the InfoNCE bound from Oord et al.
2018. Despite this greedy training, we demonstrate that each module improves
upon the output of its predecessor, and that the representations created by the
top module yield highly competitive results on downstream classification tasks
in the audio and visual domain. The proposal enables optimizing modules
asynchronously, allowing large-scale distributed training of very deep neural
networks on unlabelled datasets.
@misc{lowe2019putting,
abstract = {We propose a novel deep learning method for local self-supervised
representation learning that does not require labels nor end-to-end
backpropagation but exploits the natural order in data instead. Inspired by the
observation that biological neural networks appear to learn without
backpropagating a global error signal, we split a deep neural network into a
stack of gradient-isolated modules. Each module is trained to maximally
preserve the information of its inputs using the InfoNCE bound from Oord et al.
[2018]. Despite this greedy training, we demonstrate that each module improves
upon the output of its predecessor, and that the representations created by the
top module yield highly competitive results on downstream classification tasks
in the audio and visual domain. The proposal enables optimizing modules
asynchronously, allowing large-scale distributed training of very deep neural
networks on unlabelled datasets.},
added-at = {2021-06-15T09:31:58.000+0200},
author = {Löwe, Sindy and O'Connor, Peter and Veeling, Bastiaan S.},
biburl = {https://www.bibsonomy.org/bibtex/2409834b891c94406072d4cf501c5353e/annakrause},
description = {Putting An End to End-to-End: Gradient-Isolated Learning of Representations},
interhash = {0c564080ed4e6880520bbbc376b55b8a},
intrahash = {409834b891c94406072d4cf501c5353e},
keywords = {deeplearning optimization training},
  note = {arXiv:1905.11786. Honorable Mention for Outstanding New Directions Paper Award at NeurIPS 2019},
timestamp = {2021-06-15T09:31:58.000+0200},
title = {Putting An End to End-to-End: Gradient-Isolated Learning of
Representations},
url = {http://arxiv.org/abs/1905.11786},
year = 2019
}