Continuous-Discrete Alignment Optimization for efficient differentiable neural architecture search
Differentiable Architecture Search (DARTS) has become a prominent technique for neural architecture search in recent years. Despite its merits, the discretization discrepancy in DARTS still requires further exploration, as it can degrade the performance of the final architectures. In this paper, we introduce a novel algorithm, Continuous-Discrete Alignment Optimization (DARTS-CDAO), designed to address the discretization discrepancy and thereby enhance the robustness and generalization of the discovered neural architectures. DARTS-CDAO integrates the discretization process into the training phase of the architecture parameters, making the search algorithm adaptive to the discretization it must eventually undergo. Specifically, our method first formalizes the process of architecture-parameter discretization. We then introduce a coarse-gradient weighting algorithm to update the architecture parameters, minimizing the divergence between the continuous and discrete parameter representations. Rigorous theoretical analysis, together with extensive experimental results, shows that the proposed approach improves the performance of the searched models without incurring additional search time, rendering DARTS more robust and better able to generalize.
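The abstract does not spell out the coarse-gradient weighting rule, but the general idea it describes — discretizing the softmax-relaxed architecture parameters during search and blending a discrete (straight-through-style) gradient with the continuous one — can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm; the blending weight `lam` and the function names are assumptions introduced here for clarity.

```python
import numpy as np

def softmax(alpha):
    """Continuous relaxation of the architecture weights over candidate ops."""
    e = np.exp(alpha - alpha.max())
    return e / e.sum()

def discretize(alpha):
    """Project the relaxed weights onto a one-hot choice of the strongest op,
    as done when a DARTS supernet is converted to a discrete architecture."""
    p = softmax(alpha)
    one_hot = np.zeros_like(p)
    one_hot[np.argmax(p)] = 1.0
    return one_hot, p

def blended_grad(upstream, p, lam=0.5):
    """Hypothetical coarse-gradient weighting (assumed, for illustration):
    combine a straight-through surrogate gradient for the discrete one-hot
    (identity through argmax) with the exact gradient through the softmax."""
    # Softmax Jacobian: diag(p) - p p^T
    jac = np.diag(p) - np.outer(p, p)
    grad_continuous = jac @ upstream       # gradient of the continuous path
    grad_straight_through = upstream       # identity surrogate for argmax
    return lam * grad_straight_through + (1.0 - lam) * grad_continuous

# Toy update on architecture parameters for one edge with three candidate ops.
alpha = np.array([0.2, 1.5, -0.3])
one_hot, p = discretize(alpha)
upstream = np.array([0.1, -0.4, 0.3])      # stand-in for dL/d(op weights)
alpha_new = alpha - 0.1 * blended_grad(upstream, p)
```

The design intent of such a blend is that the search sees gradients consistent with the discrete architecture it will ultimately produce, rather than only with the continuous relaxation, which is one way to shrink the continuous-discrete gap the paper targets.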