Study Findings on Neural Computation Detailed by a Researcher at University of Kentucky (Orthogonal Gated Recurrent Unit With Neumann-Cayley Transformation)
Research findings on neural computation are discussed in a new report. According to news reporting from the University of Kentucky by NewsRx journalists, research stated, "In recent years, using orthogonal matrices has been shown to be a promising approach to improving recurrent neural networks (RNNs) with training, stability, and convergence, particularly to control gradients."

The news editors obtained a quote from the research from University of Kentucky: "While gated recurrent unit (GRU) and long short-term memory (LSTM) architectures address the vanishing gradient problem by using a variety of gates and memory cells, they are still prone to the exploding gradient problem. In this work, we analyze the gradients in GRU and propose the use of orthogonal matrices to prevent exploding gradient problems and enhance long-term memory."
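The core idea can be sketched numerically. The Cayley transform maps any skew-symmetric matrix A to an orthogonal matrix W = (I - A)(I + A)^{-1}; since W preserves vector norms, repeated multiplication by W in a recurrent update cannot blow gradients up. Judging from the paper's title, the matrix inverse is approximated with a truncated Neumann series, though the exact scheme used by the authors may differ. The function names below (`cayley`, `neumann_cayley`) are illustrative, not from the paper.

```python
import numpy as np

def cayley(A):
    """Exact Cayley transform: W = (I - A) @ inv(I + A).
    For skew-symmetric A (A.T == -A), W is orthogonal."""
    I = np.eye(A.shape[0])
    return (I - A) @ np.linalg.inv(I + A)

def neumann_cayley(A, terms=12):
    """Cayley transform with inv(I + A) replaced by the truncated
    Neumann series sum_k (-A)^k, which converges when ||A|| < 1.
    (Illustrative sketch; the paper's exact formulation may differ.)"""
    I = np.eye(A.shape[0])
    inv_approx = I.copy()
    P = I.copy()
    for _ in range(1, terms):
        P = P @ (-A)          # accumulate (-A)^k
        inv_approx += P
    return (I - A) @ inv_approx

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = 0.05 * (M - M.T)          # skew-symmetric parameter with small norm
W = cayley(A)
W_approx = neumann_cayley(A)

# Orthogonality: W @ W.T == I, so every singular value of W is 1 and
# multiplying by W neither grows nor shrinks gradient norms.
print(np.allclose(W @ W.T, np.eye(4)))  # True
```

Parameterizing the recurrent weight this way keeps it exactly on the orthogonal manifold during training: gradient steps are taken on the skew-symmetric A, and the transform guarantees the resulting W stays orthogonal.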