Differentiable neural architecture search (DNAS), one of the mainstream approaches to neural architecture search in recent years, can search network architectures effectively by adopting a gradient-based optimization search strategy. However, it suffers from unstable architecture search and high model complexity. To address these two problems, this paper proposes a differentiable neural architecture search algorithm with architecture norm regularization, which improves the stability of the architecture search, and designs a redundant edge pruning algorithm that removes redundant edges from the network architecture and reduces the complexity of the final model. Comparative experiments were conducted on four datasets: CIFAR10, CIFAR100, miniImageNet, and fetal heart standard plane (FHSP) classification. Compared with a range of state-of-the-art differentiable neural architecture search algorithms, the proposed algorithm achieves the best overall performance.
A differentiable neural architecture search algorithm with architecture norm regularization
Differentiable neural architecture search (DNAS) has emerged as a popular method for finding network architectures using a gradient-based optimization search strategy. However, it suffers from instability of the architecture search and high model complexity. To address these challenges, this paper introduces a differentiable neural architecture search algorithm with architecture norm regularization that enhances the stability of the architecture search. In addition, a redundant edge pruning algorithm is proposed to reduce the complexity of the final model by pruning redundant edges in the network architecture. Comparative experiments were conducted on four datasets: CIFAR10, CIFAR100, miniImageNet, and fetal heart standard plane (FHSP) classification. The results show that, compared with several of the latest differentiable neural architecture search algorithms, the proposed algorithm achieves the best overall performance.
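As an illustration of the regularization idea summarized above, the following is a minimal PyTorch-style sketch of a DARTS-like architecture update in which a norm penalty on the architecture parameters is added to the validation loss. The tensor shapes, the L2 form of the penalty, the coefficient `lam`, and the helper names `arch_norm_penalty` and `arch_step` are illustrative assumptions, not the paper's actual formulation.

```python
import torch

# Hypothetical architecture parameters of one DARTS-style cell:
# one row of op-mixing logits (alpha) per candidate edge.
num_edges, num_ops = 14, 8
alpha = torch.randn(num_edges, num_ops, requires_grad=True)

def arch_norm_penalty(alpha: torch.Tensor) -> torch.Tensor:
    """L2 norm of the architecture parameters (one possible 'architecture norm')."""
    return alpha.pow(2).sum()

def arch_step(val_loss: torch.Tensor, alpha: torch.Tensor,
              optimizer: torch.optim.Optimizer, lam: float = 1e-3) -> None:
    """One architecture update: validation loss plus the norm penalty."""
    loss = val_loss + lam * arch_norm_penalty(alpha)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Toy usage: a dummy validation loss that depends on alpha through the softmax
optimizer = torch.optim.Adam([alpha], lr=3e-4)
mixed_weights = torch.softmax(alpha, dim=-1)  # op-mixing weights per edge
dummy_val_loss = (mixed_weights * torch.randn_like(mixed_weights)).sum()
arch_step(dummy_val_loss, alpha, optimizer)
```

In an actual DNAS run, `dummy_val_loss` would be replaced by the supernet's loss on a held-out validation batch, so the penalty only shapes the architecture-parameter update, not the weight update.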
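The redundant edge pruning step could likewise be sketched as follows: edges whose strongest operation weight stays below a threshold are dropped, instead of always keeping a fixed number of input edges per node. The max-weight criterion, the threshold value, and the function name `prune_redundant_edges` are assumptions for illustration; the paper's actual pruning rule may differ.

```python
import torch

def prune_redundant_edges(alpha: torch.Tensor, threshold: float = 0.15):
    """Keep only edges whose strongest op weight exceeds a threshold.

    alpha: (num_edges, num_ops) architecture parameters of one cell.
    Returns a list of (edge_index, op_index) pairs for the retained edges.
    """
    op_weights = torch.softmax(alpha, dim=-1)   # normalize per edge
    best_w, best_op = op_weights.max(dim=-1)    # strongest op on each edge
    return [(i, int(best_op[i]))
            for i in range(alpha.size(0))
            if best_w[i].item() > threshold]

# Toy usage on random architecture parameters
alpha = torch.randn(14, 8)
print(prune_redundant_edges(alpha))
```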
deep learning; differentiable neural architecture search; pruning; regularization; efficient network structure search