
On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights

Recently, Li [16] introduced three kinds of single-hidden layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates on the approximation accuracy of the FNNs for continuous functions defined on bounded intervals. In the present paper, we point out that there are some errors both in the definitions of the FNNs and in the proof of the upper bound estimates in [16]. By using new methods, we also give correct approximation rate estimates for the approximation by Li's neural networks.
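The abstract concerns single-hidden-layer networks with piecewise linear activations and fixed (non-trained) weights approximating continuous functions on a bounded interval. A minimal sketch of the general idea, not Li's specific construction: any piecewise linear interpolant at fixed equally spaced knots can be written as a single hidden layer of ReLU units with prescribed weights, so the approximation error is governed by the knot spacing. All names below (`fnn_approx`, the choice of `np.sin`, the knot count) are illustrative assumptions.

```python
import numpy as np


def relu(t):
    # piecewise linear activation max(t, 0)
    return np.maximum(t, 0.0)


def fnn_approx(f, a, b, n):
    """Single-hidden-layer ReLU network with n fixed knots on [a, b]
    that reproduces the piecewise linear interpolant of f (illustrative,
    not the construction from [16])."""
    x = np.linspace(a, b, n + 1)       # fixed, equally spaced knots
    y = f(x)                           # samples of the target function
    h = (b - a) / n
    s = np.diff(y) / h                 # slopes of the linear pieces
    # ReLU coefficients: the slope jumps at the interior knots
    c = np.empty(n)
    c[0] = s[0]
    c[1:] = np.diff(s)

    def net(t):
        t = np.asarray(t, dtype=float)
        # output = bias + sum of fixed-weight hidden units relu(t - x_k)
        return y[0] + sum(c[k] * relu(t - x[k]) for k in range(n))

    return net


f = np.sin
net = fnn_approx(f, 0.0, np.pi, 64)
t = np.linspace(0.0, np.pi, 1000)
err = np.max(np.abs(net(t) - f(t)))   # uniform error on the interval
```

With 64 knots the uniform error behaves like the classical piecewise linear interpolation bound, roughly h^2/8 times the sup-norm of f'', so it is on the order of 3e-4 here; halving h quarters the error, which is the O(h^2) rate one expects for smooth targets.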

Approximation rate; modulus of continuity; modulus of smoothness; neural network operators

Dansheng Yu, Yunyou Qian, Fengjun Li


Department of Mathematics,Hangzhou Normal University,Hangzhou,Zhejiang 310036,China

School of Mathematics and Statistics,Ningxia University,Yinchuan,Ningxia 750021,China

NSFC (Grant No. 12061055)

2023

Analysis in Theory and Applications

Nanjing University

Indexed in: CSCD; Peking University Core Journals
Impact factor: 0.111
ISSN: 1672-4070
Year, Volume (Issue): 2023, 39(1)