Lightweight Transformer via Unrolling of Mixed Graph Algorithms for Traffic Forecast
Ji Qi, Tam Thuc Do, Mingxiao Liu, Zhuoshi Pan, Yuzhe Li, Gene Cheung, H. Vicky Zhao
Abstract
To forecast traffic with both spatial and temporal dimensions, we unroll a mixed-graph-based optimization algorithm into a lightweight and interpretable transformer-like neural net. Specifically, we construct two graphs: an undirected graph G^u capturing spatial correlations across geography, and a directed graph G^d capturing sequential relationships over time. We formulate a prediction problem for the future samples of signal x, assuming it is "smooth" with respect to both G^u and G^d, where we design new ℓ2- and ℓ1-norm variational terms to quantify and promote signal smoothness (low-frequency reconstruction) on a directed graph. We construct an iterative algorithm based on the alternating direction method of multipliers (ADMM), and unroll it into a feed-forward network for data-driven parameter learning. We insert graph learning modules for G^u and G^d, which are akin to the self-attention mechanism in classical transformers. Experiments show that our unrolled networks achieve traffic forecast performance competitive with state-of-the-art prediction schemes, while drastically reducing parameter counts. Our code is available at https://github.com/SingularityUndefined/Unrolling-GSP-STForecast.
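As a hedged illustration of the undirected-graph smoothness notion the abstract invokes (not the paper's exact variational terms), the classical graph Laplacian regularizer x^T L x penalizes differences between a signal's values across the edges of an undirected graph such as G^u:

```python
import numpy as np

# Illustrative sketch only: the classic graph Laplacian regularizer
# x^T L x measures l2-smoothness of a signal x on an undirected graph
# with symmetric adjacency matrix W. The paper's actual objective adds
# further directed-graph terms not shown here.
def laplacian_quadratic(W, x):
    """Return x^T L x, where L = D - W is the combinatorial Laplacian."""
    W = np.asarray(W, dtype=float)
    x = np.asarray(x, dtype=float)
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # combinatorial graph Laplacian
    return float(x @ L @ x)

# Toy 3-node chain graph: edges 0-1 and 1-2, each with weight 1.
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

smooth = laplacian_quadratic(W, np.array([1.0, 1.0, 1.0]))   # constant signal
rough  = laplacian_quadratic(W, np.array([1.0, -1.0, 1.0]))  # oscillating signal
print(smooth, rough)  # → 0.0 8.0 (constant signals cost nothing; oscillation is penalized)
```

Equivalently, x^T L x = Σ_{(i,j)} w_ij (x_i − x_j)^2, which is why promoting small values of this term favors low-frequency (smooth) reconstructions over the graph.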
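The "unrolling" idea can also be sketched generically. The toy below (an assumption for illustration, not the paper's ADMM network) turns K iterations of a simple smoothing update, gradient descent on x^T L x, into a K-layer feed-forward computation whose per-layer step sizes would be the learnable parameters:

```python
import numpy as np

# Hypothetical sketch of algorithm unrolling: each iteration of
# x <- x - alpha * (L @ x)  (gradient descent on x^T L x) becomes one
# network "layer", and the per-layer step sizes alphas would be learned
# from data. The paper unrolls an ADMM solver instead; this only shows
# the general unrolling pattern.
def unrolled_forward(L, x0, alphas):
    """Apply one layer per entry of alphas; each layer is one iteration."""
    x = np.asarray(x0, dtype=float)
    for alpha in alphas:           # K entries -> K unrolled layers
        x = x - alpha * (L @ x)    # one solver iteration = one layer
    return x

# Combinatorial Laplacian of a toy 3-node chain graph.
L = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=float)

x0 = np.array([1.0, -1.0, 1.0])   # rough input signal
alphas = [0.3, 0.3, 0.3]          # fixed here; learnable in a real unrolled net
x_out = unrolled_forward(L, x0, alphas)
```

After three layers the output is markedly smoother than the input in the x^T L x sense, which is the behavior data-driven training of the unrolled parameters would refine.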