
Gradient Omissive Descent is a Minimization Algorithm

International Journal on Soft Computing, Artificial Intelligence and Applications (IJSCAI), 8(1): 9 (February 2019)
DOI: 10.5121/ijscai.2019.8103

Abstract

This article presents a promising new gradient-based backpropagation algorithm for multi-layer feedforward networks. The method requires no manual selection of global hyperparameters and is capable of dynamic local adaptations using only first-order information, at low computational cost. Its semi-stochastic nature makes it well suited to mini-batch training and robust to different architecture choices and data distributions. Experimental evidence shows that the proposed algorithm improves training in terms of both convergence rate and speed as compared with other well-known techniques.
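The paper's exact update rule (the GOD algorithm) is not reproduced in this entry, so the sketch below is not that method. As an illustration of the general class the abstract describes, per-parameter step sizes adapted from first-order sign information alone, with no global learning rate to tune, here is a minimal implementation of a well-known member of that family, iRprop- (Igel and Hüsken, 2000). The constants (`step_init`, `eta_plus`, `eta_minus`, bounds) are the standard Rprop defaults, assumed here for illustration.

```python
import numpy as np

def irprop_minus(grad_fn, w, n_steps=200,
                 step_init=0.01, eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=1.0):
    """iRprop-: each weight keeps its own step size, adapted from the
    sign of successive gradients; only first-order information is used."""
    step = np.full_like(w, step_init)   # local step size per weight
    g_prev = np.zeros_like(w)
    for _ in range(n_steps):
        g = grad_fn(w)
        same_sign = g * g_prev > 0      # gradient kept its direction: grow step
        flipped = g * g_prev < 0        # sign flip means we overshot: shrink step
        step[same_sign] = np.minimum(step[same_sign] * eta_plus, step_max)
        step[flipped] = np.maximum(step[flipped] * eta_minus, step_min)
        g[flipped] = 0.0                # iRprop-: skip the update after a flip
        w = w - np.sign(g) * step       # move by the local step, not the magnitude
        g_prev = g
    return w

# Toy usage: minimize f(w) = ||A w - b||^2 through its gradient.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
grad = lambda w: 2 * A.T @ (A @ w - b)
w_opt = irprop_minus(grad, np.zeros(5))
print("final loss:", np.sum((A @ w_opt - b) ** 2))
```

Note that classic sign-based schemes like this one are known to degrade under noisy mini-batch gradients; the abstract's claim that the proposed method is "semi-stochastic" and fit for mini-batch training is precisely what distinguishes it from this family.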
