Gradient Omissive Descent is a Minimization Algorithm
Gustavo A. Lado and Enrique C. Segura
Universidad de Buenos Aires, Argentina
ABSTRACT
This article presents a promising new gradient-based backpropagation algorithm for multi-layer feedforward networks. The method requires no manual selection of global hyperparameters and performs dynamic local adaptations using only first-order information, at low computational cost. Its semi-stochastic nature makes it well suited to mini-batch training and robust to different architecture choices and data distributions. Experimental evidence shows that the proposed algorithm improves training in terms of both convergence rate and speed compared with other well-known techniques.
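The abstract does not reproduce the paper's actual update rule, so the following is only a rough illustration of what a locally adaptive, first-order scheme with no global learning rate can look like. It uses a sign-agreement heuristic in the spirit of Rprop adapted to mini-batches; the function name, the constants, and the toy objective are all assumptions for this sketch, not the authors' Gradient Omissive Descent method.

```python
import numpy as np

def adaptive_first_order_step(w, grad, prev_grad, step,
                              grow=1.2, shrink=0.5,
                              step_min=1e-6, step_max=1.0):
    """One update with per-weight step sizes adapted from first-order
    information only. Sign agreement between the current and previous
    mini-batch gradients suggests a consistent descent direction (grow
    the local step); disagreement suggests oscillation (shrink it).
    Rprop-style heuristic for illustration, NOT the paper's rule."""
    agree = np.sign(grad) == np.sign(prev_grad)
    step = np.clip(np.where(agree, step * grow, step * shrink),
                   step_min, step_max)
    return w - step * np.sign(grad), step

# Toy usage: minimize f(w) = 0.5 * ||w||^2, whose gradient is w.
w = np.array([3.0, -2.0])
step = np.full_like(w, 0.01)          # per-weight local step sizes
prev_grad = np.zeros_like(w)
for _ in range(200):
    grad = w                          # gradient of the toy objective
    w, step = adaptive_first_order_step(w, grad, prev_grad, step)
    prev_grad = grad
print(w)                              # oscillates near the minimizer [0, 0]
```

Because each weight keeps its own step size and the adaptation signal is purely local, no global learning rate has to be tuned by hand, which is the property the abstract emphasizes.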
KEYWORDS
Neural Networks, Backpropagation, Cost Function Minimization

Original Source URL: http://aircconline.com/ijscai/V8N1/8119ijscai03.pdf
http://airccse.org/journal/ijscai/current2019.html