Neural Networks, 2022, Vol. 148. DOI: 10.1016/j.neunet.2022.01.002

Joint learning adaptive metric and optimal classification hyperplane

Wang, Yidan; Yang, Liming

Author information

  • 1. College of Science, China Agricultural University

Abstract

Metric learning has attracted a lot of interest in classification tasks due to its efficient performance. Most traditional metric learning methods rely on k-nearest neighbors (kNN) classifiers to make decisions, where the choice of k affects generalization. In this work, we propose an end-to-end metric learning framework. Specifically, a new linear metric learning model (LMML) is first proposed to jointly learn an adaptive metric and the optimal classification hyperplane, where dissimilar samples are separated by maximizing the classification margin. A nonlinear metric learning model (called RLMML) is then developed based on a bounded nonlinear kernel function to extend LMML. The non-convexity of the proposed models makes them difficult to optimize. Half-quadratic optimization algorithms are developed to solve the problems iteratively, alternately optimizing the classification hyperplane and the adaptive metric, and the resulting algorithms are proved to converge theoretically. Numerical experiments on different types of data sets show the effectiveness of the proposed algorithms. Finally, the Wilcoxon test also confirms the feasibility and effectiveness of the proposed models. (C) 2022 Elsevier Ltd. All rights reserved.
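The abstract describes alternating optimization: with the metric fixed, the hyperplane is updated, and with the hyperplane fixed, the metric is updated. The paper's actual LMML objective and half-quadratic updates are not given here, so the following is only a toy sketch of that alternation, assuming a linear metric `L`, a soft-margin hinge loss, and plain gradient steps; all names and update rules are illustrative assumptions, not the authors' method.

```python
import numpy as np

def alternating_metric_hyperplane(X, y, dim=2, iters=50, lr=0.1, seed=0):
    """Toy alternating scheme: fix L, step (w, b); fix (w, b), step L.

    This is NOT the LMML/RLMML algorithm from the paper; it only
    illustrates the alternating structure described in the abstract.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    L = np.eye(dim, d) + 0.01 * rng.standard_normal((dim, d))  # linear metric
    w = rng.standard_normal(dim)
    b = 0.0
    for _ in range(iters):
        Z = X @ L.T                      # samples mapped into the metric space
        margins = y * (Z @ w + b)
        mask = margins < 1               # hinge-active samples
        # (1) hyperplane step with L fixed (regularized hinge gradient)
        gw = w - (y[mask, None] * Z[mask]).sum(axis=0) / n
        gb = -y[mask].sum() / n
        w -= lr * gw
        b -= lr * gb
        # (2) metric step with (w, b) fixed
        gL = -np.outer(w, (y[mask, None] * X[mask]).sum(axis=0)) / n
        L -= lr * gL
    return L, w, b

# Usage: two well-separated Gaussian blobs
rng = np.random.default_rng(1)
Xa = rng.standard_normal((50, 4)) + 2.0
Xb = rng.standard_normal((50, 4)) - 2.0
X = np.vstack([Xa, Xb])
y = np.hstack([np.ones(50), -np.ones(50)])
L, w, b = alternating_metric_hyperplane(X, y)
acc = (np.sign((X @ L.T) @ w + b) == y).mean()
```

The half-quadratic algorithms in the paper replace these gradient steps with closed-form auxiliary-variable updates, which is what enables the convergence proof the abstract mentions.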

Key words

Metric learning; Optimal classification hyperplane; Maximum margin classification; Correntropy; Half-quadratic optimization algorithm


Publication year

2022
Neural Networks


ISSN:0893-6080
Cited by: 1
References: 31