
Construction and Approximation of Deep Networks with ReLU Activation Function

This paper studies the construction and approximation of deep networks with the ReLU activation function. First, taking a deep ReLU network with exponential approximation order to x^2 on [-1,1] as a sub-network, a deep network is constructed that approximates any polynomial of degree n, and an upper bound on its approximation error is given. Then, with the help of univariate orthogonal Chebyshev polynomials, tensor-product theory, and function-approximation methods, bivariate orthogonal polynomials and a two-input deep network are constructed, and an approximation estimate for continuous functions of two variables is obtained.
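The x^2 sub-network with exponentially decaying error referred to in the abstract is commonly realized by composing ReLU "triangle" functions, a construction popularized by Yarotsky. A minimal Python sketch under that assumption (the paper's exact network may differ):

```python
def relu(x: float) -> float:
    return max(0.0, x)

def g(x: float) -> float:
    # Triangle function on [0, 1], exactly representable by one ReLU layer:
    # g(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def x2_approx(x: float, m: int) -> float:
    """Depth-O(m) ReLU approximation of x**2 on [-1, 1].

    With t = |x| = relu(x) + relu(-x), the truncated identity
    t**2 ~ t - sum_{s=1}^{m} g_s(t) / 4**s  (g_s = g composed s times)
    interpolates t**2 at the dyadic points k / 2**m, so the error
    is at most 4**-(m + 1): exponentially small in the depth m.
    """
    t = relu(x) + relu(-x)      # |x|, maps [-1, 1] into [0, 1]
    out, gt = t, t
    for s in range(1, m + 1):
        gt = g(gt)              # g composed s times
        out -= gt / 4 ** s
    return out
```

Each extra layer multiplies the error bound by 1/4, which is the "exponential approximation order" the abstract relies on.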
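The tensor-product step (building a bivariate approximant from univariate orthogonal Chebyshev polynomials) can be sketched as follows. This is a standard Chebyshev tensor-product interpolant, not necessarily the paper's exact construction:

```python
import math

def cheb_nodes(n: int) -> list:
    # Gauss-Chebyshev nodes on [-1, 1]: zeros of T_n
    return [math.cos((2 * k + 1) * math.pi / (2 * n)) for k in range(n)]

def cheb_coeffs_2d(f, n: int) -> list:
    """Tensor-product coefficients c[i][j] with
    f(x, y) ~ sum_{i,j < n} c[i][j] * T_i(x) * T_j(y),
    computed via the discrete orthogonality of T_i at the nodes."""
    xs = cheb_nodes(n)
    c = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = sum(f(xs[a], xs[b])
                    * math.cos(i * (2 * a + 1) * math.pi / (2 * n))
                    * math.cos(j * (2 * b + 1) * math.pi / (2 * n))
                    for a in range(n) for b in range(n))
            scale = (1 if i == 0 else 2) * (1 if j == 0 else 2) / (n * n)
            c[i][j] = scale * s
    return c

def cheb_eval_2d(c: list, x: float, y: float) -> float:
    # T_i(x) = cos(i * arccos(x)) on [-1, 1]
    n = len(c)
    tx = [math.cos(i * math.acos(x)) for i in range(n)]
    ty = [math.cos(j * math.acos(y)) for j in range(n)]
    return sum(c[i][j] * tx[i] * ty[j] for i in range(n) for j in range(n))
```

The interpolant is exact for bivariate polynomials of degree below n in each variable, which is what makes it a convenient target for the two-input deep network.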

ReLU activation function; approximation; Chebyshev polynomial; deep networks

Liu Aili, Chen Zhixiang


School of Mathematics, Physics and Information, Shaoxing University, Shaoxing 312000, Zhejiang, China


2024

Journal of Shaoxing University
Shaoxing University


CHSSCD
Impact factor: 0.267
ISSN:1008-293X
Year, Volume (Issue): 2024, 44(2)