Incremental learning algorithm for large-scale semi-supervised ordinal regression

As a special case of multi-class classification, ordinal regression (also known as ordinal classification) is a popular method for tackling multi-class problems whose samples are marked by a set of ranks. Semi-supervised ordinal regression (SSOR) is especially important for data-mining applications because semi-supervised learning can exploit unlabeled samples to train a high-quality learning model. However, to the best of our knowledge, training large-scale SSOR models remains an open question due to their complicated formulation and non-convexity. To address this challenging problem, we propose an incremental learning algorithm for SSOR (IL-SSOR), which can directly update the solution of SSOR based on the KKT conditions. More critically, we analyze the finite convergence of IL-SSOR, which guarantees that SSOR converges to a local minimum under the framework of the concave-convex procedure. To the best of our knowledge, IL-SSOR is the first efficient online learning algorithm for SSOR with a local-minimum convergence guarantee. Our experimental results show that IL-SSOR achieves better generalization than other semi-supervised multi-class algorithms, and achieves similar generalization with less running time than other semi-supervised ordinal regression algorithms. (C) 2022 Elsevier Ltd. All rights reserved.
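The abstract states that convergence to a local minimum is established under the concave-convex procedure (CCCP) framework. As background, a minimal sketch of the generic CCCP idea on a toy one-dimensional objective (not the paper's SSOR objective, whose formulation is not given here): split a non-convex objective into a convex part plus a concave part, linearize the concave part at the current iterate, and minimize the resulting convex surrogate.

```python
import math

# Toy non-convex objective f(x) = u(x) + v(x), with
#   u(x) = x**4      (convex part)
#   v(x) = -2*x**2   (concave part)
# This example is illustrative only; the paper's SSOR objective differs.

def f(x):
    return x**4 - 2 * x**2

def cccp(x0, iters=60):
    """Generic concave-convex procedure on the toy objective.

    At each step, linearize the concave part v at the current iterate x_t:
        v(x) ~= v(x_t) + v'(x_t) * (x - x_t),  with v'(x) = -4*x,
    then minimize the convex surrogate u(x) + v'(x_t)*x.
    For this toy problem the surrogate minimizer has the closed form
        x_{t+1} = cbrt(x_t)   (from setting 4*x**3 - 4*x_t = 0).
    """
    x = x0
    for _ in range(iters):
        x = math.copysign(abs(x) ** (1.0 / 3.0), x)
    return x

x_star = cccp(2.0)
# The iterates converge to x = 1, a local minimum of f (f(1) = -1).
```

Each CCCP step solves a convex problem whose value upper-bounds the original objective, so the objective is non-increasing across iterations; this monotonicity is the standard route to local-minimum convergence arguments of the kind the paper invokes.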

Keywords: Semi-supervised ordinal regression; Incremental learning; Concave-convex procedure algorithm; Path following algorithm

Chen, Haiyan; Jia, Yizhen; Ge, Jiaming; Gu, Bin


Coll Comp Sci & Technol,Nanjing Univ Aeronaut & Astronaut

Coll Comp & Software,Nanjing Univ Informat Sci & Technol

2022

Neural Networks


Indexed in: EI, SCI
ISSN: 0893-6080
Year, Volume (Issue): 2022, 149