
Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale

2023-07-04 · Code Available

Tong Nie, Guoyang Qin, Lijun Sun, Wei Ma, Yu Mei, Jian Sun

Abstract

Spatiotemporal traffic data (STTD) exhibits complex correlational structures, and numerous advanced techniques have been designed to capture them for effective forecasting. However, because STTD is often massive in scale, practitioners must strike a balance between effectiveness and efficiency using computationally efficient models. An alternative paradigm based on the multilayer perceptron (MLP), the MLP-Mixer, promises both simplicity and effectiveness. Taking inspiration from its success in other domains, we propose an adapted version, named NexuSQN, for STTD forecasting at scale. We first identify the challenges faced when directly applying MLP-Mixers as series- and window-wise multivaluedness. To distinguish between spatial and temporal patterns, we then propose the concept of ST-contextualization. Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks. Furthermore, NexuSQN has demonstrated its versatility across different domains, including energy and environmental data, and has been deployed in a collaborative project with Baidu to predict congestion in megacities such as Beijing and Shanghai. Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
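For readers unfamiliar with the MLP-Mixer backbone the paper adapts, the sketch below shows a single Mixer block: a token-mixing MLP applied across the sequence axis followed by a channel-mixing MLP applied across the feature axis, each with a residual connection and layer normalization. All dimensions, the random weight initialization, and the GELU approximation are illustrative assumptions for exposition; this is not the NexuSQN configuration, and it does not implement the paper's ST-contextualization mechanism.

```python
import numpy as np

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def layer_norm(x, eps=1e-5):
    # normalize over the last (feature) axis
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def mlp(x, w1, w2):
    # two-layer MLP with GELU nonlinearity (biases omitted for brevity)
    return gelu(x @ w1) @ w2

def mixer_block(x, params):
    # token mixing: MLP across the token (e.g. sensor/time-step) axis
    y = layer_norm(x)
    x = x + np.swapaxes(mlp(np.swapaxes(y, -1, -2), params["tw1"], params["tw2"]), -1, -2)
    # channel mixing: MLP across the feature axis
    y = layer_norm(x)
    return x + mlp(y, params["cw1"], params["cw2"])

rng = np.random.default_rng(0)
n_tokens, d_model, d_hidden = 8, 16, 32   # illustrative sizes, not the paper's
params = {
    "tw1": rng.normal(0, 0.02, (n_tokens, d_hidden)),
    "tw2": rng.normal(0, 0.02, (d_hidden, n_tokens)),
    "cw1": rng.normal(0, 0.02, (d_model, d_hidden)),
    "cw2": rng.normal(0, 0.02, (d_hidden, d_model)),
}
x = rng.normal(size=(n_tokens, d_model))
out = mixer_block(x, params)
print(out.shape)  # (8, 16)
```

The appeal for large-scale STTD is that both mixing steps are plain matrix products, so the cost grows linearly in the number of series rather than quadratically as in attention-based models.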
