
Derivative Extension of the Lagrange Polynomial Interpolation Formula and Its Application in Ciphertext-Trained Neural Networks

The Lagrange polynomial interpolation formula is used to characterize, as a polynomial, an unknown function passing through a given set of points, and also to approximate existing non-polynomial functions by polynomials; it is widely applied in many fields. This paper explores a general extension of the Lagrange polynomial interpolation formula to derivatives. The extended interpolation formula approximates the target function not only in its values at the interpolation points but also in how it varies around those points. When the derivatives up to a given order are known at the interpolation points, the extended formula yields a deeper polynomial approximation of the target function. Experimental results show that a ciphertext-trained neural network rebuilt with the derivative-extended Lagrange polynomial in place of the logistic function achieves higher training accuracy and a smaller mean square error, indicating that the extended Lagrange polynomial interpolation formula is applicable to more general scenarios.
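For reference, the classical Lagrange interpolation formula and the kind of derivative-matching conditions described in the abstract can be written out as follows. The notation below is ours, not the paper's, and the extension is stated only in its generic form (matching derivatives up to an order m at each point), without reproducing the paper's specific construction.

```latex
% Classical Lagrange interpolation through (x_0, f(x_0)), ..., (x_n, f(x_n)):
p(x) = \sum_{i=0}^{n} f(x_i)\,\ell_i(x),
\qquad
\ell_i(x) = \prod_{\substack{0 \le j \le n \\ j \ne i}} \frac{x - x_j}{x_i - x_j}.
% A derivative extension of the kind described in the abstract additionally
% requires the interpolant to match derivatives up to a given order m at
% every interpolation point,
p^{(k)}(x_i) = f^{(k)}(x_i), \qquad k = 0, 1, \dots, m, \quad i = 0, 1, \dots, n,
% which determines a polynomial of degree at most (m+1)(n+1) - 1
% (classical Hermite / osculating interpolation when m = 1).
```

The short Python sketch below is our own illustration of that idea, not the paper's code or its ciphertext setting: it compares a plain Lagrange interpolant of the logistic function, matching only function values at five points, with a derivative-extended interpolant that also matches the first derivatives at the same points, and prints the mean square error of each over the interpolation interval. The interpolation points, the interval, and the use of NumPy are assumptions made for the illustration.

```python
# Minimal sketch (not the paper's code): compare a plain Lagrange interpolant of
# the logistic function (values only) with a derivative-extended interpolant
# (values + first derivatives) at the same points.  The interpolation points
# and the interval below are illustrative assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def interpolating_polynomial(points, values, derivs=None):
    """Polynomial matching the given values (and, if supplied, first
    derivatives) at the given points, obtained by solving the corresponding
    (confluent) Vandermonde system."""
    n_cond = len(points) * (2 if derivs is not None else 1)
    powers = np.arange(n_cond)               # exponents 0 .. degree
    rows, rhs = [], []
    for i, x in enumerate(points):
        rows.append(x ** powers)              # value condition   p(x_i)  = f(x_i)
        rhs.append(values[i])
        if derivs is not None:                # derivative cond.   p'(x_i) = f'(x_i)
            rows.append(powers * x ** np.clip(powers - 1, 0, None))
            rhs.append(derivs[i])
    coeffs = np.linalg.solve(np.array(rows, dtype=float), np.array(rhs, dtype=float))
    return np.polynomial.Polynomial(coeffs)

points = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])   # assumed interpolation points
grid = np.linspace(points.min(), points.max(), 1001)

plain = interpolating_polynomial(points, sigmoid(points))
extended = interpolating_polynomial(points, sigmoid(points), sigmoid_prime(points))

for name, p in (("values only", plain), ("values + derivatives", extended)):
    mse = np.mean((p(grid) - sigmoid(grid)) ** 2)
    print(f"{name:22s} degree {p.degree():2d}   MSE {mse:.3e}")
```

For a smooth activation function such as the logistic function, matching the derivatives at the interpolation points typically tightens the fit between the points as well, which is the effect the abstract reports for the ciphertext-trained network.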

Derivative extension of Lagrange polynomial interpolation; ciphertext-trained neural network; polynomial approximation; activation function

杨舒雅、李晓东、张健毅


Beijing Electronic Science and Technology Institute, Beijing 100070


2024

Journal of Beijing Electronic Science and Technology Institute
Beijing Electronic Science and Technology Institute


Impact factor: 0.245
ISSN: 1672-464X
Year, volume (issue): 2024, 32(1)