Continuous Evolutionary Neural Architecture Search Based on Biased Sampling
Neural Architecture Search (NAS) typically requires considerable time and computing resources because each candidate architecture it searches is evaluated independently. To address this challenge, a continuous evolutionary NAS method based on biased sampling (OEvNAS) is proposed. OEvNAS maintains a supernet throughout the architecture search, and every neural network architecture in the search space is a subnet of this supernet. In each generation of the evolutionary search, the supernet is trained for a small number of epochs; the subnets then inherit the supernet's weights for performance evaluation, eliminating the need for retraining. To improve the supernet's prediction performance, a training strategy based on biased sampling is introduced, which trains superior networks with higher probability, improving training efficiency and reducing weight coupling. In addition, a novel crossover and mutation strategy is designed to strengthen global exploration. The effectiveness of OEvNAS is evaluated on two search spaces, NATS-Bench and Differentiable Architecture Search (DARTS), and the results show that OEvNAS outperforms the compared state-of-the-art algorithms. In the NATS-Bench search space, the new supernet training strategy achieves excellent prediction accuracy on CIFAR-10, CIFAR-100, and ImageNet-16-120. In the DARTS search space, the best searched neural network architecture achieves classification accuracies of 97.67% and 83.79% on CIFAR-10 and CIFAR-100, respectively.
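To make the described procedure concrete, the following is a minimal Python sketch of biased subnet sampling combined with weight-sharing evaluation, under assumed placeholder interfaces (biased_sample, evolve_one_generation, supernet_train_step, evaluate are hypothetical names, not the authors' actual API); it is an illustration of the general idea, not the paper's implementation.

```python
import math
import random

def biased_sample(population, scores, temperature=1.0):
    """Sample one architecture, favoring those with higher estimated scores.

    Softmax-style weights over the current score estimates implement the
    bias toward superior networks described in the abstract (assumed form).
    """
    m = max(scores)
    weights = [math.exp((s - m) / temperature) for s in scores]
    return random.choices(population, weights=weights, k=1)[0]

def evolve_one_generation(population, scores, supernet_train_step, evaluate):
    """One generation: briefly train the supernet on biased samples, then
    re-score every subnet using inherited (shared) supernet weights."""
    for _ in range(len(population)):      # a few supernet updates per generation
        arch = biased_sample(population, scores)
        supernet_train_step(arch)         # update only the sampled path's weights
    # Subnets inherit supernet weights, so evaluation needs no retraining.
    return [evaluate(arch) for arch in population]
```

In this sketch, better-scoring subnets are trained more often, which is one plausible reading of how biased sampling can raise training efficiency and reduce weight coupling among paths that share supernet parameters.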