Journal of Shaoxing University, 2024, Vol. 44, Issue (2): 58-68. DOI: 10.16169/j.issn.1008-293x.2024.02.006


Construction and Approximation of Deep Networks with ReLU Activation Function

LIU Aili¹, CHEN Zhixiang¹

Author Information

  • 1. School of Mathematics and Information Science, Shaoxing University, Shaoxing 312000, Zhejiang, China

Abstract

This paper studies the construction and approximation of deep networks with the ReLU activation function. First, taking a deep ReLU network that approximates x² on [-1, 1] with exponential approximation order as a sub-network, a deep network is constructed that approximates an arbitrary polynomial of degree n, and an upper bound on its approximation error is given. Then, with the help of univariate orthogonal Chebyshev polynomials, tensor-product theory, and methods of function approximation, bivariate orthogonal polynomials and a deep network with two inputs are constructed, and an approximation estimate for continuous functions of two variables is obtained.
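The sub-network with exponential approximation order to x² that the abstract builds on is commonly realized by the sawtooth (Yarotsky-style) construction: a three-unit ReLU "tooth" composed with itself m times yields an approximant with error decaying like 4⁻ᵐ. A minimal NumPy sketch of that idea, not the paper's exact network (function names here are illustrative, not from the paper):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tooth(x):
    # Hat function g: [0,1] -> [0,1], realizable with 3 ReLU units:
    # g(x) = 2 relu(x) - 4 relu(x - 1/2) + 2 relu(x - 1)
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, m):
    # Depth-m ReLU network approximating t**2 on [-1, 1]:
    # f_m(|x|) = |x| - sum_{s=1}^m g^(s)(|x|) / 4**s,
    # where g^(s) is the s-fold composition of the tooth function.
    t = relu(x) + relu(-x)        # |x|, so the net also works on [-1, 1]
    g = t
    out = t
    for s in range(1, m + 1):
        g = tooth(g)              # one more composed sawtooth layer
        out = out - g / 4.0 ** s
    return out

xs = np.linspace(-1.0, 1.0, 1001)
err = np.max(np.abs(square_approx(xs, 8) - xs ** 2))
print(err)  # error shrinks geometrically as depth m grows
```

Each added layer halves the mesh of the piecewise-linear interpolant of t², which is what gives the exponential (in depth) approximation order the abstract refers to.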

Key words

ReLU activation function / approximation / Chebyshev polynomial / deep networks


Publication Year

2024
Journal of Shaoxing University (绍兴文理学院学报), Shaoxing University
Indexed in: CHSSCD
Impact factor: 0.267
ISSN: 1008-293X
References: 28