Fi^2VTS: Time Series Forecasting Via Capturing Intra- and Inter-Variable Variations in the Frequency Domain
Rujia Shen, Yang Yang, Yaoxion Lin, Liangliang Liu, Boran Wang, Yi Guan, Jingchi Jiang
Abstract
Time series forecasting (TSF) plays a crucial role in various applications, including medical monitoring and crop growth. Despite advances in deep learning methods for TSF, their capacity to predict long-term series remains constrained. This limitation arises from the failure to account for intra- and inter-variable variations simultaneously. To mitigate this challenge, we introduce the Fi^2VBlock, which leverages a Frequency domain perspective to capture intra- and inter-variable Variations. After the input is transformed into the frequency domain via the Frequency Transform Module, a Frequency Cross Attention between the real and imaginary parts is designed to obtain enhanced frequency representations and capture intra-variable variations. Furthermore, Inception blocks are employed to integrate information, thereby capturing correlations across different variables. Our backbone network, Fi^2VTS, adopts a residual architecture by stacking multiple Fi^2VBlocks, which prevents degradation issues. Theoretically, we demonstrate that Fi^2VTS achieves a substantial reduction in both time and memory complexity, decreasing from O(L^2) to O(L) per Fi^2VBlock computation. Empirical evaluations show that Fi^2VTS outperforms other baselines on two benchmark datasets. The implementation code is available at https://github.com/HITshenrj/Fi2VTS.
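To make the pipeline described above concrete, the following is a minimal PyTorch sketch of one Fi^2VBlock-style layer, based only on the abstract: a frequency transform (here a real FFT), a cross attention where queries come from the real part and keys/values from the imaginary part, inception-style convolutions mixing information across variables, and a residual connection. All layer names, dimensions, and design details are assumptions for illustration, not the authors' implementation; in particular, this naive attention is quadratic in the number of frequency bins, unlike the paper's O(L) design.

```python
import torch
import torch.nn as nn


class Fi2VBlockSketch(nn.Module):
    """Hypothetical sketch of a Fi^2VBlock (not the official implementation)."""

    def __init__(self, num_vars: int, seq_len: int, d_model: int = 32):
        super().__init__()
        # Per-frequency-bin embeddings for the cross attention.
        self.q = nn.Linear(1, d_model)   # queries from the real part
        self.k = nn.Linear(1, d_model)   # keys from the imaginary part
        self.v = nn.Linear(1, d_model)   # values from the imaginary part
        self.out = nn.Linear(d_model, 1)
        # Inception-style branches with different kernel sizes,
        # mixing information across the variable dimension.
        self.branches = nn.ModuleList([
            nn.Conv1d(num_vars, num_vars, kernel_size=ks, padding=ks // 2)
            for ks in (1, 3, 5)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_vars, seq_len)
        b, v, length = x.shape
        spec = torch.fft.rfft(x, dim=-1)                 # complex, (b, v, F)
        real = spec.real.reshape(b * v, -1, 1)           # frequency bins as tokens
        imag = spec.imag.reshape(b * v, -1, 1)
        # Cross attention between real and imaginary parts (intra-variable).
        q, k, val = self.q(real), self.k(imag), self.v(imag)
        attn = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
        enhanced = self.out(attn @ val).reshape(b, v, -1)
        # Back to the time domain with the enhanced real part.
        y = torch.fft.irfft(torch.complex(enhanced, spec.imag), n=length, dim=-1)
        # Inception branches capture inter-variable correlations.
        y = sum(branch(y) for branch in self.branches)
        return x + y  # residual connection, as in the stacked Fi^2VTS backbone
```

A usage sketch: stacking several such blocks and reading out a forecast head would mirror the residual backbone the abstract describes, e.g. `block = Fi2VBlockSketch(num_vars=3, seq_len=16)` applied to a tensor of shape `(batch, 3, 16)` returns a tensor of the same shape.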