Journal of Computational and Applied Mathematics, 2022, Vol. 406. DOI: 10.1016/j.cam.2021.113921

LSPIA, (stochastic) gradient descent, and parameter correction

Rios, Dany; Juettler, Bert

Author Information

  • 1. Johannes Kepler Univ Linz

Abstract

We show that the LSPIA method for curve and surface approximation, which was introduced by Deng and Lin (2014), is equivalent to a gradient descent method. We also note that Deng and Lin's results concerning feasible values of the stepsize are directly implied by classical results about convergence properties of the gradient descent method. We propose a modification based on stochastic gradient descent, which lends itself to a realization that employs the technology of neural networks. In addition, we show how to incorporate the optimization of the parameterization of the given data into this framework via parameter correction (PC). This leads to the new LSPIA-PC method and its neural-network based implementation. Numerical experiments indicate that it gives better results than LSPIA with comparable computational costs. (c) 2021 Elsevier B.V. All rights reserved.
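The abstract's central observation — that the LSPIA control-point update is a gradient-descent step on the least-squares fitting error — can be illustrated with a minimal sketch. Here `A` stands for the collocation matrix `A[j, i] = B_i(t_j)` of the spline basis evaluated at the data parameters, `Q` for the data points, and `mu` for the stepsize; these names and the toy dimensions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lspia_step(P, A, Q, mu):
    """One LSPIA iteration: P += mu * A^T (Q - A P).

    This is exactly a gradient-descent step of length mu on the
    least-squares error E(P) = 0.5 * ||A P - Q||^2, whose gradient
    is A^T (A P - Q).
    """
    return P + mu * A.T @ (Q - A @ P)

# Toy example (illustrative, not the paper's test data): fit 3 control
# points to 6 planar data points through a random collocation matrix.
rng = np.random.default_rng(0)
A = rng.random((6, 3))
Q = rng.random((6, 2))  # 2D data points, one per row

# Classical gradient-descent theory gives convergence for any stepsize
# mu < 2 / lambda_max(A^T A); we pick a safe value below that bound.
mu = 1.0 / np.linalg.norm(A, 2) ** 2

P = np.zeros((3, 2))
for _ in range(20000):
    P = lspia_step(P, A, Q, mu)

# The iterates converge to the least-squares solution of A P ~ Q.
P_ls, *_ = np.linalg.lstsq(A, Q, rcond=None)
print(np.linalg.norm(P - P_ls))  # small residual after many iterations
```

The stochastic-gradient-descent variant proposed in the paper would replace the full residual `Q - A @ P` by the residual on a random subset of the data points in each step.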

Key words

Progressive iterative approximation / Neural networks / Parameter correction / B-spline curve / Surface


Publication Year

2022

Journal of Computational and Applied Mathematics

Indexed in: EI, SCI
ISSN: 0377-0427
Cited by: 10
References: 32