Fixup Initialization: Residual Learning Without Normalization
H. Zhang, Y. Dauphin, and T. Ma. (2019). arXiv:1901.09321. Comment: Accepted for publication at ICLR 2019; see https://openreview.net/forum?id=H1gsz30cKX.
Abstract
Normalization layers are a staple in state-of-the-art deep neural network
architectures. They are widely believed to stabilize training, enable higher
learning rates, accelerate convergence, and improve generalization, though the
reason for their effectiveness is still an active research topic. In this work,
we challenge the commonly-held beliefs by showing that none of the perceived
benefits is unique to normalization. Specifically, we propose fixed-update
initialization (Fixup), an initialization motivated by solving the exploding
and vanishing gradient problem at the beginning of training via properly
rescaling a standard initialization. We find training residual networks with
Fixup to be as stable as training with normalization -- even for networks with
10,000 layers. Furthermore, with proper regularization, Fixup enables residual
networks without normalization to achieve state-of-the-art performance in image
classification and machine translation.
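
Since the record carries only the abstract, the following is a minimal sketch of the rescaling idea, assuming PyTorch; the class name FixupBasicBlock and all argument names are illustrative, not taken from the authors' released code. Following the paper's recipe, the last layer of each residual branch starts at zero, the remaining branch layers get a standard He initialization rescaled by L^(-1/(2m-2)) (L residual blocks, m layers per branch), and normalization's affine parameters are replaced by scalar biases and a multiplier.

import torch
import torch.nn as nn


class FixupBasicBlock(nn.Module):
    """Normalization-free residual block with Fixup-style initialization (sketch)."""

    def __init__(self, channels, num_blocks, layers_per_branch=2):
        super().__init__()
        # Scalar biases (init 0) and a scalar multiplier (init 1) stand in for
        # the affine parameters that batch normalization would otherwise provide.
        self.bias1 = nn.Parameter(torch.zeros(1))
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bias2 = nn.Parameter(torch.zeros(1))
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.scale = nn.Parameter(torch.ones(1))
        self.bias3 = nn.Parameter(torch.zeros(1))

        # Standard He initialization, rescaled by L^(-1/(2m-2)) inside the branch.
        nn.init.kaiming_normal_(self.conv1.weight, mode='fan_out', nonlinearity='relu')
        with torch.no_grad():
            self.conv1.weight.mul_(num_blocks ** (-1.0 / (2 * layers_per_branch - 2)))
        # The last layer of the branch starts at zero, so the block is initially
        # the identity mapping and gradients neither explode nor vanish.
        nn.init.zeros_(self.conv2.weight)

    def forward(self, x):
        out = torch.relu(self.conv1(x + self.bias1))
        out = self.conv2(out + self.bias2)
        return torch.relu(x + self.scale * out + self.bias3)

A network built from L = num_blocks such blocks (plus a zero-initialized classification layer, as the paper prescribes) can then be trained without any normalization layers.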
%0 Generic
%1 zhang2019fixup
%A Zhang, Hongyi
%A Dauphin, Yann N.
%A Ma, Tengyu
%D 2019
%K dl resnet
%T Fixup Initialization: Residual Learning Without Normalization
%U http://arxiv.org/abs/1901.09321
%X Normalization layers are a staple in state-of-the-art deep neural network
architectures. They are widely believed to stabilize training, enable higher
learning rate, accelerate convergence and improve generalization, though the
reason for their effectiveness is still an active research topic. In this work,
we challenge the commonly-held beliefs by showing that none of the perceived
benefits is unique to normalization. Specifically, we propose fixed-update
initialization (Fixup), an initialization motivated by solving the exploding
and vanishing gradient problem at the beginning of training via properly
rescaling a standard initialization. We find training residual networks with
Fixup to be as stable as training with normalization -- even for networks with
10,000 layers. Furthermore, with proper regularization, Fixup enables residual
networks without normalization to achieve state-of-the-art performance in image
classification and machine translation.
@misc{zhang2019fixup,
abstract = {Normalization layers are a staple in state-of-the-art deep neural network
architectures. They are widely believed to stabilize training, enable higher
learning rate, accelerate convergence and improve generalization, though the
reason for their effectiveness is still an active research topic. In this work,
we challenge the commonly-held beliefs by showing that none of the perceived
benefits is unique to normalization. Specifically, we propose fixed-update
initialization (Fixup), an initialization motivated by solving the exploding
and vanishing gradient problem at the beginning of training via properly
rescaling a standard initialization. We find training residual networks with
Fixup to be as stable as training with normalization -- even for networks with
10,000 layers. Furthermore, with proper regularization, Fixup enables residual
networks without normalization to achieve state-of-the-art performance in image
classification and machine translation.},
added-at = {2019-02-03T17:03:52.000+0100},
author = {Zhang, Hongyi and Dauphin, Yann N. and Ma, Tengyu},
biburl = {https://www.bibsonomy.org/bibtex/201a8f7e73bb11d105ef118ecb93ba13b/bechr7},
description = {Fixup Initialization: Residual Learning Without Normalization},
interhash = {49a7eba7572818fbcfb3bb0bb9f0a925},
intrahash = {01a8f7e73bb11d105ef118ecb93ba13b},
keywords = {dl resnet},
note = {arXiv:1901.09321. Comment: Accepted for publication at ICLR 2019; see https://openreview.net/forum?id=H1gsz30cKX},
timestamp = {2019-02-03T17:03:52.000+0100},
title = {Fixup Initialization: Residual Learning Without Normalization},
url = {http://arxiv.org/abs/1901.09321},
year = 2019
}