Large-scale sparse multiobjective optimization problems (SMOPs) are widespread in the real world. Proposing a generic solution for large-scale SMOPs can advance problem solving in fields such as evolutionary computation, cybernetics, and machine learning. Because SMOPs have high-dimensional decision spaces and sparse Pareto-optimal solutions, existing evolutionary algorithms easily fall into the curse of dimensionality when solving them. To address this problem, taking the learning of sparse distributions as the entry point, this paper proposes a large-scale multiobjective evolutionary algorithm based on online learning of sparse features (MOEA/OLSF). Specifically, an online sparse-feature learning method is first designed to mine nonzero variables; a sparse genetic operator is then proposed for further searching the nonzero variables and generating offspring solutions, and during the nonzero-variable search its binary crossover and mutation operators are also used to control the sparsity and diversity of solutions. Comparisons with state-of-the-art algorithms on test problems of different scales show that the proposed algorithm is superior in both convergence speed and overall performance.
Abstract
Large-scale sparse multiobjective optimization problems (SMOPs) are widespread in the real world. Proposing generic solutions for large-scale SMOPs can improve problem solving in the fields of evolutionary computation, cybernetics, and machine learning. Due to the high-dimensional decision space and the sparse Pareto-optimal solutions of SMOPs, existing evolutionary algorithms are vulnerable to the curse of dimensionality when solving SMOPs. To address these problems, a large-scale multiobjective evolutionary algorithm based on online learning of sparse features (MOEA/OLSF) is proposed, with the learning of sparse distributions as an entry point. Specifically, an online sparse-feature learning method is designed to mine nonzero variables. Then a sparse genetic operator is proposed for further searching the nonzero variables and generating offspring solutions. Its binary crossover and mutation operators are used to control the sparsity and diversity of solutions in the nonzero-variable mining process. Comparison results with state-of-the-art algorithms on test problems of different scales show that the proposed algorithm outperforms the compared algorithms in terms of convergence speed and performance.
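To make the role of the binary crossover and mutation operators concrete, the following is a minimal illustrative sketch, not the authors' actual MOEA/OLSF implementation: a solution's sparse structure is represented as a 0/1 mask over decision variables, uniform binary crossover recombines two parent masks, and a low-probability bit-flip mutation perturbs the child mask. The function names, the uniform-crossover choice, and the probability values are all assumptions made for illustration.

```python
import random

def binary_crossover(mask_a, mask_b, swap_prob=0.5):
    """Uniform binary crossover: each child bit is taken from one of the
    two parent masks at random (illustrative, not the paper's operator)."""
    return [a if random.random() < swap_prob else b
            for a, b in zip(mask_a, mask_b)]

def binary_mutation(mask, flip_prob=0.01):
    """Bit-flip mutation on the 0/1 mask; a small flip_prob keeps the
    offspring mask sparse while still injecting diversity."""
    return [bit ^ 1 if random.random() < flip_prob else bit
            for bit in mask]

# Masks mark the (candidate) nonzero decision variables of two parents.
parent_a = [1, 0, 0, 1, 0, 0, 0, 1]
parent_b = [0, 1, 0, 1, 0, 0, 0, 0]
child_mask = binary_mutation(binary_crossover(parent_a, parent_b))
```

In this sketch, real-valued search would only be applied to the variables whose mask bit is 1, which is how a binary mask can confine the search to a low-dimensional subspace of the original decision space.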