Construction and Approximation of Deep Networks with ReLU Activation Function
This paper studies the construction and approximation of deep networks with the ReLU activation function. First, taking a deep network that approximates x^2 on [-1, 1] with exponential approximation order as a sub-network, a deep network is constructed to approximate any polynomial of degree n, and an upper bound on the approximation error is given. Then, with the help of univariate orthogonal Chebyshev polynomials, tensor product theory, and methods of function approximation, bivariate orthogonal polynomials and a deep network with two inputs are constructed, and an approximation estimate for continuous functions of two variables is obtained.
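The sub-network with exponential approximation order to x^2 mentioned above can be sketched along the lines of the well-known Yarotsky construction, in which a depth-m composition of a ReLU-expressible "hat" function yields error on the order of 4^(-(m+1)). This is a minimal numerical sketch under that assumption, not the paper's exact construction; the function names (`hat`, `square_approx`) are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat (sawtooth) function on [0, 1], expressed with three ReLU units:
    # hat(0) = 0, hat(0.5) = 1, hat(1) = 0.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, m):
    # ReLU-network approximation of x**2 on [-1, 1].
    # |x| = relu(x) + relu(-x) maps the input into [0, 1],
    # then t**2 is approximated by t minus a sum of composed hats,
    # with error bounded by 4**-(m + 1) (exponential in the depth m).
    t = relu(x) + relu(-x)
    out = t.copy()
    g = t
    for s in range(1, m + 1):
        g = hat(g)          # s-fold composition of the hat function
        out = out - g / 4.0**s
    return out

xs = np.linspace(-1, 1, 1001)
err = np.max(np.abs(square_approx(xs, 8) - xs**2))
```

Increasing the depth parameter m shrinks the error geometrically, which is what allows the polynomial-approximating network built on this sub-network to reach the stated upper bounds.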