SOTAVerified

PDFormer: Propagation Delay-Aware Dynamic Long-Range Transformer for Traffic Flow Prediction

2023-01-19 · Code Available

Jiawei Jiang, Chengkai Han, Wayne Xin Zhao, Jingyuan Wang


Abstract

As a core technology of Intelligent Transportation Systems (ITS), traffic flow prediction has a wide range of applications. The fundamental challenge in traffic flow prediction is to effectively model the complex spatial-temporal dependencies in traffic data. Spatial-temporal Graph Neural Network (GNN) models have emerged as one of the most promising methods to solve this problem. However, GNN-based models have three major limitations for traffic prediction: i) Most methods model spatial dependencies in a static manner, which limits the ability to learn dynamic urban traffic patterns; ii) Most methods only consider short-range spatial information and are unable to capture long-range spatial dependencies; iii) These methods ignore the fact that the propagation of traffic conditions between locations has a time delay in traffic systems. To this end, we propose a novel Propagation Delay-aware dynamic long-range transFormer, namely PDFormer, for accurate traffic flow prediction. Specifically, we design a spatial self-attention module to capture the dynamic spatial dependencies. Then, two graph masking matrices are introduced to highlight spatial dependencies from short- and long-range views. Moreover, a traffic delay-aware feature transformation module is proposed to empower PDFormer with the capability of explicitly modeling the time delay of spatial information propagation. Extensive experimental results on six real-world public traffic datasets show that our method can not only achieve state-of-the-art performance but also exhibit competitive computational efficiency. Moreover, we visualize the learned spatial-temporal attention map to make our model highly interpretable.
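The graph-masked spatial self-attention described in the abstract can be sketched as follows. This is a minimal single-head NumPy illustration, not the authors' implementation: the mask construction (binary adjacency for the short-range view, an all-pairs mask standing in for a similarity-based long-range view) and all function names are assumptions for illustration only.

```python
import numpy as np

def masked_spatial_attention(X, mask):
    """Scaled dot-product attention over road-network nodes,
    restricted to node pairs allowed by a binary graph mask.
    (Hypothetical sketch of the idea, not PDFormer's exact module.)"""
    N, d = X.shape
    scores = X @ X.T / np.sqrt(d)            # (N, N) pairwise affinities
    scores = np.where(mask, scores, -1e9)    # block masked-out node pairs
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ X                       # aggregate allowed neighbors

# Toy example: 4 nodes, 2 features per node.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))

# Short-range view: self plus immediate chain neighbors (assumed adjacency).
short_mask = (np.eye(4, dtype=bool)
              | np.eye(4, k=1, dtype=bool)
              | np.eye(4, k=-1, dtype=bool))
# Long-range view: all pairs here; in practice this would be a
# sparser mask built from, e.g., traffic-pattern similarity.
long_mask = np.ones((4, 4), dtype=bool)

out_short = masked_spatial_attention(X, short_mask)
out_long = masked_spatial_attention(X, long_mask)
```

Using two masks lets the same attention mechanism attend to physically adjacent nodes and to distant but functionally related nodes, which is the short-/long-range split the abstract refers to.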

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| PeMS04 (PeMSD4) | PDFormer | 12-step MAE | 18.32 | — | Unverified |
| PeMS07 (PeMSD7) | PDFormer | 12-step MAE | 19.83 | — | Unverified |
| PeMS08 (PeMSD8) | PDFormer | 12-step MAE | 13.58 | — | Unverified |
