A Neural Network Training Method Based on Adam Adaptive Learning Rate

Addressing the learning-rate adjustment problem in deep neural network training, this paper introduces a regularization term into the Adam (adaptive moment estimation) algorithm to improve the training of convolutional neural networks. Experiments on the CIFAR-10 dataset show that, compared with the traditional Adam optimizer, the improved Adam algorithm based on a regularization mechanism shortens training time, improves test and validation accuracy, and reduces training loss, demonstrating better generalization ability.
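The abstract does not give the paper's exact formulation of the regularization mechanism. A common way to fold a regularization term into Adam is to add an L2 penalty gradient (`weight_decay * theta`, an assumed hyperparameter here) to the raw gradient before the moment updates. A minimal pure-Python sketch of one such Adam step, under that assumption:

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999,
              eps=1e-8, weight_decay=1e-2):
    """One Adam update with an L2 regularization term folded into the
    gradient. `weight_decay` is an illustrative hyperparameter, not a
    value taken from the paper."""
    new_theta, new_m, new_v = [], [], []
    for th, g0, mi, vi in zip(theta, grad, m, v):
        g = g0 + weight_decay * th            # regularized gradient
        mi = beta1 * mi + (1 - beta1) * g     # first-moment estimate
        vi = beta2 * vi + (1 - beta2) * g * g # second-moment estimate
        m_hat = mi / (1 - beta1 ** t)         # bias correction
        v_hat = vi / (1 - beta2 ** t)
        new_theta.append(th - lr * m_hat / (math.sqrt(v_hat) + eps))
        new_m.append(mi)
        new_v.append(vi)
    return new_theta, new_m, new_v

# Toy demo: minimize f(theta) = ||theta||^2 from a fixed start.
theta = [1.0, -2.0]
m = [0.0, 0.0]
v = [0.0, 0.0]
for t in range(1, 2001):
    grad = [2.0 * x for x in theta]           # gradient of ||theta||^2
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)                                  # parameters driven toward 0
```

An alternative design, used by AdamW, decouples the weight decay from the moment estimates and applies it directly to the parameters; which variant the paper uses cannot be determined from the abstract alone.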

neural network; adaptive moment estimation; regularization; learning rate

Zhang Tianzhong (张天中)


Zhengzhou University of Industry Technology (郑州工业应用技术学院), Zhengzhou 451100, Henan, China

neural network; Adam optimization algorithm; regularization; learning rate

2024

信息与电脑 (China Computer & Communication)
Beijing Electronics Holding Co., Ltd. (北京电子控股有限责任公司)

Impact factor: 1.143
ISSN: 1003-9767
Year, Volume (Issue): 2024, 36(6)