Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting
Bryan Lim, Sercan O. Arik, Nicolas Loeff, Tomas Pfister
Code Available
- github.com/unit8co/darts (PyTorch) ★ 9,275
- github.com/jdb78/pytorch-forecasting (PyTorch) ★ 4,841
- github.com/etna-team/etna (PyTorch) ★ 193
- github.com/aryan-jadon/regression-loss-functions-in-time-series-forecasting-tensorflow (TensorFlow) ★ 86
- github.com/eeci/annex_37 (PyTorch) ★ 8
- github.com/mavarick/tft (PyTorch) ★ 6
- github.com/bobbercheng/ventilator-pressure-prediction (TensorFlow) ★ 1
- github.com/IKKIM00/multi-horizon-forecasting-comparison-between-TFT-and-DL-methods (PyTorch) ★ 0
- github.com/LiamMaclean216/Temporal-Fusion-Transformer (PyTorch) ★ 0
- github.com/ntubiolin/tft (TensorFlow) ★ 0
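For readers who want to try the model, the darts library at the top of the list ships a ready-made TFT implementation. Below is a minimal sketch of fitting it on a toy dataset; the dataset, hyperparameters, and horizon are illustrative choices, not values from the paper.

```python
# Minimal sketch using the darts library listed above
# (pip install "u8darts[torch]"); all settings below are illustrative.
from darts.datasets import AirPassengersDataset
from darts.models import TFTModel

series = AirPassengersDataset().load()
train, val = series[:-36], series[-36:]

# add_relative_index=True makes darts synthesize a relative time index
# as a known-future covariate, so no explicit future inputs are needed.
model = TFTModel(
    input_chunk_length=24,    # encoder length (past window)
    output_chunk_length=12,   # decoder length (forecast horizon)
    hidden_size=16,
    num_attention_heads=4,
    add_relative_index=True,
    n_epochs=5,               # kept small for a quick smoke test
)
model.fit(train)

# The TFT is probabilistic (quantile regression) by default,
# so draw samples and read off the median forecast.
forecast = model.predict(n=36, num_samples=100)
print(forecast.quantile_timeseries(0.5))
```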
Abstract
Multi-horizon forecasting problems often contain a complex mix of inputs -- including static (i.e. time-invariant) covariates, known future inputs, and other exogenous time series that are only observed historically -- without any prior information on how they interact with the target. While several deep learning models have been proposed for multi-step prediction, they typically comprise black-box models which do not account for the full range of inputs present in common scenarios. In this paper, we introduce the Temporal Fusion Transformer (TFT) -- a novel attention-based architecture which combines high-performance multi-horizon forecasting with interpretable insights into temporal dynamics. To learn temporal relationships at different scales, the TFT utilizes recurrent layers for local processing and interpretable self-attention layers for learning long-term dependencies. The TFT also uses specialized components for the judicious selection of relevant features and a series of gating layers to suppress unnecessary components, enabling high performance in a wide range of regimes. On a variety of real-world datasets, we demonstrate significant performance improvements over existing benchmarks, and showcase three practical interpretability use-cases of TFT.
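To make the "gating layers to suppress unnecessary components" concrete, the following is a minimal PyTorch sketch of a gated residual network (GRN), the TFT building block in which a GLU gate can drive a sub-layer's contribution toward zero so the residual connection effectively skips it. Layer sizes and names here are illustrative, and the paper's optional context input is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedResidualNetwork(nn.Module):
    """Minimal sketch of the TFT gated residual network (GRN).

    The GLU gate can push its output toward zero, letting the residual
    connection bypass the block -- the gating mechanism the abstract
    describes for suppressing unnecessary components.
    """

    def __init__(self, d_model: int, d_hidden: int, dropout: float = 0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_model)
        self.gate = nn.Linear(d_model, 2 * d_model)  # GLU: values + gates
        self.dropout = nn.Dropout(dropout)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.fc2(F.elu(self.fc1(x)))          # nonlinear processing
        h = self.dropout(h)
        h = F.glu(self.gate(h), dim=-1)           # sigmoid-gated output
        return self.norm(x + h)                   # residual + layer norm

# Quick shape check on a (batch, time, features) tensor.
grn = GatedResidualNetwork(d_model=32, d_hidden=64)
out = grn(torch.randn(8, 24, 32))
print(out.shape)  # torch.Size([8, 24, 32])
```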