ICC   25427
INSTITUTO DE INVESTIGACION EN CIENCIAS DE LA COMPUTACION
Executing Unit (UE)
Articles
Title:
GRADIENT OMISSIVE DESCENT IS A MINIMIZATION ALGORITHM
Author(s):
SEGURA, ENRIQUE CARLOS; LADO, GUSTAVO
Journal:
International Journal on Soft Computing, Artificial Intelligence and Applications
Publisher:
AIRCC Publishing Corporation
References:
Year: 2019, vol. 8, pp. 37-45
ISSN:
2319-4081
Abstract:
This article presents a promising new gradient-based backpropagation algorithm for multi-layer feedforward networks. The method requires no manual selection of global hyperparameters and performs dynamic local adaptations using only first-order information at low computational cost. Its semi-stochastic nature makes it well suited to mini-batch training and robust to different architecture choices and data distributions. Experimental evidence shows that the proposed algorithm improves training, in both convergence rate and speed, compared with other well-known techniques.
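To make the notion of "dynamic local adaptations using only first-order information" concrete, the following is a minimal sketch of mini-batch backpropagation with a per-parameter, sign-based step-size adaptation (in the spirit of Rprop). It is an illustrative assumption, not the paper's Gradient Omissive Descent algorithm, whose update rule is not specified in this abstract; all names, constants, and the toy network here are placeholders.

```python
# Illustrative sketch only: generic per-parameter adaptive first-order update
# (sign-based, Rprop-like). NOT the Gradient Omissive Descent method itself.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem and a one-hidden-layer network (assumed sizes).
X = rng.normal(size=(256, 4))
y = np.sin(X.sum(axis=1, keepdims=True))

W1 = rng.normal(scale=0.5, size=(4, 16)); b1 = np.zeros((1, 16))
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros((1, 1))
params = [W1, b1, W2, b2]

# Per-parameter local step sizes, adapted from first-order information only.
steps = [np.full_like(p, 1e-2) for p in params]
prev_grads = [np.zeros_like(p) for p in params]

def forward_backward(xb, yb):
    """Return mini-batch loss and gradients via plain backpropagation."""
    h = np.tanh(xb @ W1 + b1)
    out = h @ W2 + b2
    err = out - yb
    loss = float(np.mean(err ** 2))
    d_out = 2.0 * err / len(xb)
    gW2 = h.T @ d_out
    gb2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)
    gW1 = xb.T @ d_h
    gb1 = d_h.sum(axis=0, keepdims=True)
    return loss, [gW1, gb1, gW2, gb2]

for epoch in range(200):
    order = rng.permutation(len(X))
    for start in range(0, len(X), 32):                 # mini-batches
        idx = order[start:start + 32]
        loss, grads = forward_backward(X[idx], y[idx])
        for p, g, s, pg in zip(params, grads, steps, prev_grads):
            agree = np.sign(g) * np.sign(pg)
            # Grow the local step when successive gradient signs agree,
            # shrink it when they disagree; no global learning rate needed.
            s *= np.where(agree > 0, 1.2, np.where(agree < 0, 0.5, 1.0))
            np.clip(s, 1e-6, 1.0, out=s)
            p -= s * np.sign(g)                        # sign-based local update
            pg[...] = g

print("final mini-batch loss:", round(loss, 4))
```

The sketch only demonstrates the general class of technique the abstract describes: all adaptation is local (per weight), driven by first-order gradient signs, and applied per mini-batch without any globally tuned learning rate.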