Universal consistency of deep ReLU neural networks
With the explosive growth of data and richer computing resources, shallow neural networks can no longer always meet practical requirements, which has led to the emergence of deep neural networks. The rapid development of deep neural networks is reflected mainly in applications, while theoretical research remains relatively scarce. This paper focuses on the universal consistency of deep ReLU neural networks. The contents include: first, whether there exists a deep neural network with a unified structure (i.e., with the depth, width, and activation function fixed in advance) that can learn more features and possesses the universal approximation property; second, showing that the resulting deep neural network model has the property of universal consistency; finally, verifying the theoretical results experimentally.
deep neural networks; universal consistency; deep learning; ReLU function; approximation
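To make the notion of a "unified structure" concrete, the following is a minimal sketch (not the paper's construction) of a fully connected deep ReLU network whose depth, width, and activation function are fixed in advance; the specific dimensions, widths, and the helper name make_relu_network are illustrative assumptions, not values taken from the paper.

```python
# A minimal sketch, assuming PyTorch: a deep ReLU network with fixed depth and
# width, illustrating the kind of "unified structure" referred to above.
# Depth, width, and input dimension are illustrative, not from the paper.
import torch
import torch.nn as nn

def make_relu_network(in_dim: int, width: int, depth: int, out_dim: int = 1) -> nn.Sequential:
    """Build a fully connected ReLU network with `depth` hidden layers of size `width`."""
    layers = [nn.Linear(in_dim, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, out_dim))
    return nn.Sequential(*layers)

# Example: once depth, width, and activation are chosen, the architecture is fully determined.
net = make_relu_network(in_dim=8, width=32, depth=4)
x = torch.randn(16, 8)      # a batch of 16 inputs
print(net(x).shape)         # torch.Size([16, 1])
```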