Partial least squares (PLS) regression is a linear regression technique that plays an important role in dealing with high-dimensional regressors. Unfortunately, PLS is sensitive to outliers in datasets and consequently produces a corrupted model. In this paper, we propose a robust method for PLS based on the idea of least trimmed squares (LTS), in which the objective is to minimize the sum of the smallest h squared residuals. However, solving an LTS problem is generally NP-hard. Inspired by the complementary idea of Sim and Hartley, we instead solve the inverse of the LTS problem and formulate it as a concave maximization problem, which is a convex optimization problem and can be solved in polynomial time. Classical PLS as well as two of the most efficient robust PLS methods, Partial Robust M (PRM) regression and RSIMPLS, are compared in this study. Results on both simulated and real data sets show the effectiveness and robustness of our approach.
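To illustrate the trimming objective described above, the following is a minimal sketch that repeatedly refits PLS on the h observations with the smallest squared residuals (a simple C-step style heuristic). It is only an illustration under stated assumptions, not the paper's concave-maximization reformulation; the function name trimmed_pls and all parameter choices are hypothetical.

```python
# Hypothetical sketch of the LTS idea applied to PLS residuals:
# iteratively keep the h observations with the smallest squared residuals
# and refit. This is NOT the concave-maximization method of the paper.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def trimmed_pls(X, y, h, n_components=2, n_iter=20, random_state=0):
    rng = np.random.default_rng(random_state)
    n = X.shape[0]
    subset = rng.choice(n, size=h, replace=False)      # random initial subset
    pls = PLSRegression(n_components=n_components)
    for _ in range(n_iter):
        pls.fit(X[subset], y[subset])                  # fit on current subset
        resid = (y - pls.predict(X).ravel()) ** 2      # squared residuals on all points
        new_subset = np.argsort(resid)[:h]             # keep the h smallest
        if set(new_subset) == set(subset):             # subset stable: stop
            break
        subset = new_subset
    return pls, subset

# Toy usage with a few artificial outliers
X = np.random.randn(100, 10)
y = X[:, 0] + 0.1 * np.random.randn(100)
y[:5] += 10                                            # contaminate 5 responses
model, inliers = trimmed_pls(X, y, h=90)
```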
Keywords: Least trimmed squares; Partial least squares; Robust PLS; SIMPLS
Xie Z., Feng X., Chen X.
School of Marine Science and Technology, Northwestern Polytechnical University
College of Electrical and Electronic Engineering, Wenzhou University