Adaptive gradient descent optimization algorithm with correction term
The batch stochastic gradient descent (SGD) optimization algorithm is commonly used for training convolutional neural networks (CNNs), and its performance directly affects the convergence speed of the network. In recent years, several adaptive gradient descent optimization algorithms have been proposed, such as the Adam and RAdam algorithms. However, these algorithms do not exploit the gradient norms of historical iterations in combination with the second moment of the gradients of the random subsample, which leads to slow convergence and unstable performance. This paper proposes a new adaptive gradient descent optimization algorithm, normEve, which combines historical gradient norms with the second moment of gradients. Simulation experiments show that the new algorithm effectively improves convergence speed when the two quantities are combined. In a comparison with the Adam optimization algorithm, the new algorithm achieves higher accuracy, which validates its practical applicability.
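The exact update rule is not given in this abstract, so the following is a minimal illustrative sketch only: it assumes an Adam-style first- and second-moment update whose step is additionally scaled by the ratio of the current gradient norm to an exponential moving average of historical gradient norms. The function normeve_step, the decay rate beta3, and the scaling factor d are hypothetical names introduced for illustration and are not taken from the paper.

    import numpy as np

    def normeve_step(theta, grad, state, lr=0.05, beta1=0.9, beta2=0.999,
                     beta3=0.999, eps=1e-8):
        """One hypothetical normEve-style step: Adam-style first and second
        moments plus a running average of historical gradient norms used as
        a correction term (illustrative sketch, not the paper's update rule)."""
        state['t'] += 1
        t = state['t']
        state['m'] = beta1 * state['m'] + (1 - beta1) * grad           # first moment
        state['v'] = beta2 * state['v'] + (1 - beta2) * grad ** 2      # second moment
        state['n'] = beta3 * state['n'] + (1 - beta3) * np.linalg.norm(grad)  # norm history
        m_hat = state['m'] / (1 - beta1 ** t)   # bias-corrected estimates
        v_hat = state['v'] / (1 - beta2 ** t)
        n_hat = state['n'] / (1 - beta3 ** t)
        # Hypothetical correction term: scale the step by the ratio of the
        # current gradient norm to the historical norm average.
        d = np.linalg.norm(grad) / (n_hat + eps)
        return theta - lr * d * m_hat / (np.sqrt(v_hat) + eps)

    # Toy usage: minimise f(x) = ||x||^2, whose gradient is 2x.
    state = {'t': 0, 'm': 0.0, 'v': 0.0, 'n': 0.0}
    x = np.random.randn(5)
    for _ in range(500):
        x = normeve_step(x, 2 * x, state)
    print(np.linalg.norm(x))  # the norm should shrink toward zero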