LSPIA, (stochastic) gradient descent, and parameter correction

We show that the LSPIA method for curve and surface approximation, introduced by Deng and Lin (2014), is equivalent to a gradient descent method. We also note that Deng and Lin's results on feasible stepsize values follow directly from classical convergence results for gradient descent. We propose a modification based on stochastic gradient descent, which lends itself to an implementation using neural-network technology. In addition, we show how to incorporate the optimization of the parameterization of the given data into this framework via parameter correction (PC). This leads to the new LSPIA-PC method and its neural-network-based implementation. Numerical experiments indicate that it gives better results than LSPIA at comparable computational cost.
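To make the stated equivalence concrete: with the collocation matrix A (entries N_j(t_i), the B-spline basis functions evaluated at the data parameters) and data points p, the LSPIA update c ← c + μ·Aᵀ(p − A c) is exactly one gradient descent step with stepsize μ on the fitting error E(c) = ½‖A c − p‖², so the classical feasibility condition 0 < μ < 2/λ_max(AᵀA) applies. The following Python sketch illustrates this correspondence, a stochastic variant, and one first-order parameter-correction step. It is an illustration under our own choices (test curve, uniform initial parameters, clamped knots, subset size, iteration counts, and a standard foot-point update standing in for the paper's PC step; SciPy >= 1.8 is assumed for BSpline.design_matrix), not the authors' implementation.

```python
# Minimal sketch (not the paper's code): LSPIA as gradient descent on the
# least-squares fitting error, plus a stochastic variant and one step of
# first-order parameter correction.  Assumes NumPy and SciPy >= 1.8.
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)

# Data points on a test curve, with (assumed) uniform parameters in [0, 1].
params = np.linspace(0.0, 1.0, 50)
pts = np.column_stack([np.cos(2 * np.pi * params), np.sin(2 * np.pi * params)])

degree, n_ctrl = 3, 10
# Clamped uniform knot vector (n_ctrl + degree + 1 knots).
knots = np.concatenate([np.zeros(degree),
                        np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                        np.ones(degree)])
# Collocation matrix A[i, j] = N_j(params[i]).
A = BSpline.design_matrix(params, knots, degree).toarray()

# Classical feasibility condition: 0 < mu < 2 / lambda_max(A^T A).
mu = 1.9 / np.linalg.eigvalsh(A.T @ A).max()

# Initial control points sampled from the data, as is customary for (LS)PIA.
c = pts[np.round(np.linspace(0, len(pts) - 1, n_ctrl)).astype(int)].copy()

for _ in range(500):
    r = pts - A @ c        # residual vectors at the data points
    c += mu * (A.T @ r)    # LSPIA step == gradient step on E(c) = 0.5*||Ac - p||^2

# Stochastic variant: each step uses a random subset of the data.  With a
# constant stepsize this only reaches a neighbourhood of the least-squares fit.
c_sgd = pts[np.round(np.linspace(0, len(pts) - 1, n_ctrl)).astype(int)].copy()
for _ in range(2000):
    idx = rng.choice(len(params), size=10, replace=False)
    c_sgd += mu * (A[idx].T @ (pts[idx] - A[idx] @ c_sgd))

# One first-order parameter-correction step (a standard foot-point update,
# shown as a generic stand-in for the paper's PC step): move each t_i along
# the curve towards the point closest to p_i, then rebuild A.
spl = BSpline(knots, c, degree)
tang = spl.derivative()(params)                      # curve tangents C'(t_i)
res = spl(params) - pts                              # C(t_i) - p_i
params = np.clip(params - np.sum(res * tang, axis=1) / np.sum(tang**2, axis=1),
                 0.0, 1.0)
A = BSpline.design_matrix(params, knots, degree).toarray()
```

In a neural-network realization of the kind the abstract mentions, the control points would presumably act as the trainable weights and the stochastic loop above as minibatch training; the sketch stays in plain NumPy to keep the correspondence with the formulas visible.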

Keywords: Progressive iterative approximation; Neural networks; Parameter correction; B-spline curve; Surface

Rios, Dany; Juettler, Bert

Johannes Kepler Univ Linz

2022

Journal of Computational and Applied Mathematics

Indexed in: EI, SCI
ISSN: 0377-0427
Year, Volume: 2022, 406