Universality of gradient descent neural network training

It has been observed that design choices of neural networks are often crucial for their successful optimization. In this article, we therefore discuss the question of whether it is always possible to redesign a neural network so that it trains well with gradient descent. This yields the following universality result: if, for a given network, there is any algorithm that can find good network weights for a classification task, then there exists an extension of this network that reproduces the same forward model by mere gradient descent training. The construction is not intended for practical computations, but it provides some orientation on the possibilities of pre-trained networks in meta-learning and related approaches. © 2022 Elsevier Ltd. All rights reserved.
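The paper's construction (an extended network whose gradient-descent training simulates an arbitrary weight-finding algorithm) is far more involved than anything shown here. As a point of reference only, the following toy sketch illustrates the "mere gradient descent training" the abstract refers to: a small one-hidden-layer network trained by hand-written backpropagation on a synthetic two-class problem. All data, sizes, and hyperparameters are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, linearly separable toy data: label = 1 iff x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# One hidden tanh layer, sigmoid output (sizes chosen arbitrarily).
W1 = rng.normal(scale=0.5, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8)
b2 = 0.0

def forward(X):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))    # class-1 probability
    return H, p

def bce(p):
    # Binary cross-entropy loss.
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

lr = 0.5
losses = []
for _ in range(300):
    H, p = forward(X)
    losses.append(bce(p))
    # Backpropagation by hand.
    dz2 = (p - y) / len(y)                      # grad w.r.t. output pre-activation
    gW2 = H.T @ dz2
    gb2 = dz2.sum()
    dH = np.outer(dz2, W2) * (1.0 - H**2)       # back through tanh
    gW1 = X.T @ dH
    gb1 = dH.sum(axis=0)
    # Plain (full-batch) gradient descent step.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The point of contrast with the paper: here gradient descent happens to succeed on an easy problem, whereas the universality result concerns redesigning a network so that gradient descent provably reproduces the output of *any* successful weight-finding algorithm.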

Keywords: Deep neural networks; Universality; Global minima; Turing machines; Meta-learning; Biologically plausible learning

Welper, G.


Department of Mathematics, University of Central Florida

2022

Neural Networks


Indexed in: EI, SCI
ISSN: 0893-6080
Year, Volume (Issue): 2022, 150