Attention meets long short-term memory: A deep learning network for traffic flow forecasting
Accurate forecasting of future traffic flow has a wide range of applications and is a fundamental component of intelligent transportation systems. However, timely and accurate traffic forecasting remains an open challenge due to the high nonlinearity and volatility of traffic flow data. Canonical long short-term memory (LSTM) networks are easily drawn to focus on minute-to-minute fluctuations rather than the long-term dependencies of the traffic flow evolution. To address this issue, we propose to introduce an attention mechanism to the long short-term memory network for short-term traffic flow forecasting. The attention mechanism enables the network to assign different weights to different inputs, focus on critical and important information, and make accurate predictions. Extensive experiments on four benchmark data sets show that the LSTM network equipped with an attention mechanism has superior performance compared with commonly used and state-of-the-art models. (C) 2021 Elsevier B.V. All rights reserved.
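The weighting idea described in the abstract can be sketched as attention over the hidden states an LSTM emits at each time step: a score is computed per step, normalized with a softmax, and used to form a weighted context vector for the final prediction. The sketch below uses additive (Bahdanau-style) attention with random stand-ins `H`, `W`, and `v` for the learned hidden states and parameters; it illustrates the mechanism only, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 12, 8                      # 12 time steps, hidden size 8 (illustrative)
H = rng.normal(size=(T, d))       # stand-in for LSTM hidden states h_1..h_T
W = rng.normal(size=(d, d))       # attention projection matrix (assumed learned)
v = rng.normal(size=(d,))         # attention scoring vector (assumed learned)

# Additive attention: e_t = v^T tanh(W h_t), then softmax over time steps.
scores = np.tanh(H @ W) @ v
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()              # attention weights, non-negative and sum to 1

# Context vector: weighted sum of hidden states, fed to the output layer.
context = alpha @ H               # shape (d,)

print(context.shape, round(float(alpha.sum()), 6))
```

Time steps with larger scores dominate the context vector, which is how the model emphasizes informative parts of the recent traffic history instead of treating all steps equally.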

Intelligent transportation system; Traffic flow modeling; Time series analysis; Deep learning; Attention mechanism; Noise-immune learning; NEURAL-NETWORK; KALMAN FILTER; PREDICTION; MODEL; REGRESSION

Fang, Weiwei; Zhuo, Wenhao; Yan, Jingwen; Song, Youyi; Jiang, Dazhi; Zhou, Teng


Shantou Univ

Hong Kong Polytech Univ

2022

Physica A: Statistical Mechanics and its Applications

ISSN:0378-4371
Year, volume (issue): 2022, 587