Physica A: Statistical Mechanics and its Applications, 2022, Vol. 587. DOI: 10.1016/j.physa.2021.126485

Attention meets long short-term memory: A deep learning network for traffic flow forecasting

Fang, Weiwei (1); Zhuo, Wenhao (1); Yan, Jingwen (1); Song, Youyi (2); Jiang, Dazhi (1); Zhou, Teng (1)

Author Information

  • 1. Shantou University
  • 2. Hong Kong Polytechnic University

Abstract

Accurate forecasting of future traffic flow is a fundamental component of intelligent transportation systems and has a wide range of applications. However, timely and accurate traffic forecasting remains an open challenge due to the high nonlinearity and volatility of traffic flow data. Canonical long short-term memory (LSTM) networks are easily drawn to focus on minute-to-minute fluctuations rather than the long-term dependencies of the traffic flow evolution. To address this issue, we propose to introduce an attention mechanism into the long short-term memory network for short-term traffic flow forecasting. The attention mechanism helps the network assign different weights to different inputs, focus on critical information, and make accurate predictions. Extensive experiments on four benchmark data sets show that the LSTM network equipped with an attention mechanism outperforms commonly used and state-of-the-art models. (C) 2021 Elsevier B.V. All rights reserved.
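The abstract describes re-weighting the LSTM's outputs so the model attends to the most informative time steps rather than short-term fluctuations. The sketch below illustrates this general idea in PyTorch; it is not the authors' implementation, and the layer sizes, window length, and names (AttentionLSTM, score, out) are illustrative assumptions.

    # Minimal sketch (not the paper's code): an LSTM whose hidden states are
    # re-weighted by a learned attention layer before the final forecast.
    import torch
    import torch.nn as nn

    class AttentionLSTM(nn.Module):
        def __init__(self, n_features=1, hidden_size=64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
            self.score = nn.Linear(hidden_size, 1)   # one attention score per time step
            self.out = nn.Linear(hidden_size, 1)     # next-interval flow value

        def forward(self, x):                        # x: (batch, time, n_features)
            h, _ = self.lstm(x)                      # h: (batch, time, hidden_size)
            alpha = torch.softmax(self.score(h), dim=1)   # weights over the time axis
            context = (alpha * h).sum(dim=1)         # weighted sum of hidden states
            return self.out(context)                 # (batch, 1) forecast

    # Usage: forecast the next interval from the previous 12 observations
    # (dummy data; real inputs would be measured traffic flow).
    model = AttentionLSTM()
    history = torch.randn(8, 12, 1)
    print(model(history).shape)                      # torch.Size([8, 1])

The weighted sum lets hidden states from informative intervals dominate the forecast, which is the mechanism the abstract credits for focusing on long-term dependencies instead of minute-to-minute noise.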

Key words

Intelligent transportation system; Traffic flow modeling; Time series analysis; Deep learning; Attention mechanism; Noise-immune learning; Neural network; Kalman filter; Prediction; Model; Regression


Publication year: 2022
Journal: Physica A: Statistical Mechanics and its Applications (ISSN: 0378-4371)
Times cited: 25
References cited: 46