Neural Networks, 2022, Vol. 149, 18. DOI: 10.1016/j.neunet.2022.02.003

Backpropagation Neural Tree

Ojha, Varun (1); Nicosia, Giuseppe (2)

Author Information

  • 1. Dept Comp Sci, Univ Reading
  • 2. Ctr Syst Biol, Univ Cambridge

Abstract

We propose a novel algorithm called Backpropagation Neural Tree (BNeuralT), which is a stochastic computational dendritic tree. BNeuralT takes random repeated inputs through its leaves and imposes dendritic nonlinearities through its internal connections, as a biological dendritic tree would. Given these plausible dendritic-tree-like biological properties, BNeuralT is a single-neuron neural tree model whose internal sub-trees resemble dendritic nonlinearities. The BNeuralT algorithm produces an ad hoc neural tree that is trained with a stochastic gradient descent optimizer such as gradient descent (GD), momentum GD, Nesterov accelerated GD, Adagrad, RMSprop, or Adam. BNeuralT training has two phases, each computed in a depth-first search manner: the forward pass computes the neural tree's output in a post-order traversal, while the error backpropagation during the backward pass is performed recursively in a pre-order traversal. A BNeuralT model can be considered a minimal subset of a neural network (NN), meaning it is a "thinned" NN whose complexity is lower than that of an ordinary NN. Our algorithm produces high-performing and parsimonious models, balancing complexity with descriptive ability on a wide variety of machine learning problems: classification, regression, and pattern recognition. © 2022 Elsevier Ltd. All rights reserved.
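
To make the traversal scheme concrete, below is a minimal Python sketch of a single-output neural tree; it is not the authors' implementation. The class names `Leaf` and `Internal`, the sigmoid nonlinearity, and the plain-GD update are illustrative assumptions standing in for BNeuralT's randomly generated ad hoc tree and its wider family of SGD optimizers.

```python
import math
import random

class Leaf:
    """Leaf node: reads one (possibly repeated) input feature,
    mirroring how BNeuralT's leaves take random repeated inputs."""
    def __init__(self, feature_index):
        self.feature_index = feature_index

    def forward(self, x):
        return x[self.feature_index]

    def backward(self, grad, lr):
        pass  # leaves hold no trainable parameters in this sketch


class Internal:
    """Internal node: weighted sum of child outputs passed through a
    sigmoid, standing in for a dendritic nonlinearity."""
    def __init__(self, children):
        self.children = children
        self.weights = [random.uniform(-1.0, 1.0) for _ in children]
        self.bias = random.uniform(-1.0, 1.0)

    def forward(self, x):
        # Forward pass in post-order (depth-first): evaluate every
        # sub-tree before producing this node's own output.
        self.child_outs = [c.forward(x) for c in self.children]
        z = self.bias + sum(w * o for w, o in zip(self.weights, self.child_outs))
        self.out = 1.0 / (1.0 + math.exp(-z))
        return self.out

    def backward(self, grad, lr):
        # Backward pass in pre-order (depth-first): apply this node's
        # update first, then recurse into each child with its gradient.
        local = grad * self.out * (1.0 - self.out)   # sigmoid derivative
        child_grads = [local * w for w in self.weights]  # use pre-update weights
        for i, o in enumerate(self.child_outs):
            self.weights[i] -= lr * local * o        # plain GD step
        self.bias -= lr * local
        for child, g in zip(self.children, child_grads):
            child.backward(g, lr)
```

A minimal usage sketch (again an assumption, not code from the paper): the hand-built tree below repeats input features across its leaves and is fitted by plain GD to a single toy regression target under a squared-error loss.

```python
random.seed(0)
tree = Internal([Leaf(0), Internal([Leaf(1), Leaf(0)]), Leaf(1)])
x, y = [0.4, 0.9], 0.7
for _ in range(500):
    pred = tree.forward(x)
    tree.backward(pred - y, lr=0.5)   # gradient of 0.5 * (pred - y)**2
print(round(tree.forward(x), 3))      # should approach 0.7
```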

Key words

Stochastic gradient descent; RMSprop; Backpropagation; Minimal architecture; Neural networks; Neural trees; ENSEMBLE; LIBRARY


Publication Year

2022

Neural Networks

Indexed in: EI, SCI
ISSN: 0893-6080
Cited by: 5
References: 57