Sparse Vector Feature Extraction Algorithm Based on the Conjugate Gradient Direction of SCN Functions
Sparse vector feature extraction refers to constraining the solution of an optimization problem with various norms so as to obtain a solution with sparse features, and it is widely used in feature extraction problems arising in machine learning, deep learning, big data analysis, and other complex systems. A large number of studies have shown that the commonly used norms, such as the L0, L1, and L2 norms, each have their own drawbacks: norms that yield less accurate and less sparse solutions are easier to optimize, while norms that yield more accurate and sparser solutions are harder to solve. In this paper, a sparse vector feature extraction algorithm (CGDL) based on the conjugate gradient direction of SCN functions is proposed. Sparse vector feature extraction is formulated as a sparse feature extraction optimization model whose objective function is an SCN function; the L0 norm is transformed so that the model becomes a convex-concave minimax problem with a special structure that is equivalent to a bilevel program. This class of problems covers sparse regression, image feature extraction, and compressed sensing. This paper gives the detailed computational steps of the sparse feature extraction algorithm for the above model together with a proof of its convergence. Numerical comparison experiments on a real data set and high-dimensional simulated data sets evaluate the effectiveness, complexity, and convergence speed of the algorithm. The results show that the proposed method is significantly superior to the comparison methods in accuracy and sparsity, and has a good convergence rate.
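As background for the norm trade-off mentioned above, a minimal illustrative formulation (standard sparse-recovery background, not the paper's SCN-based model) contrasts the exact L0 problem with its convex L1 relaxation:

\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|Ax - b\|_2^2 \quad \text{s.t.} \quad \|x\|_0 \le k \qquad \text{(exact but combinatorial)}
\]
\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1 \qquad \text{(convex relaxation, easier to solve)}
\]

The L0 constraint counts nonzero entries and gives the sparsest, most faithful solution but is NP-hard in general, whereas the L1 penalty is convex and tractable at the cost of some accuracy and sparsity; this is precisely the trade-off the abstract describes.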