
Long-term Forecasting using Higher Order Tensor RNNs

2017-10-31 · ICLR 2018 · Code Available

Rose Yu, Stephan Zheng, Anima Anandkumar, Yisong Yue


Abstract

We present Higher-Order Tensor RNN (HOT-RNN), a novel family of neural sequence architectures for multivariate forecasting in environments with nonlinear dynamics. Long-term forecasting in such systems is highly challenging, since they exhibit long-term temporal dependencies, higher-order correlations, and sensitivity to error propagation. Our proposed recurrent architecture addresses these issues by learning the nonlinear dynamics directly, using higher-order moments and higher-order state transition functions. Furthermore, we decompose the higher-order structure using the tensor-train decomposition to reduce the number of parameters while preserving model performance. We theoretically establish approximation guarantees and a variance bound for HOT-RNN on general sequence inputs. We also demonstrate 5% to 12% improvements in long-term prediction over general RNN and LSTM architectures on a range of simulated environments with nonlinear dynamics, as well as on real-world time-series data.
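To make the parameter-reduction claim concrete, here is a minimal NumPy sketch (not the authors' code; all names and sizes are illustrative assumptions) of a tensor-train (TT) factorization: a P-th order transition tensor over a hidden state of size d has d**P entries when stored densely, but a TT representation keeps only P small 3-way cores.

```python
import numpy as np

def tt_cores(order, dim, rank, rng):
    """Random TT cores for a tensor of shape (dim,) * order.

    Boundary ranks are 1, interior ranks are `rank`, so core k
    has shape (ranks[k], dim, ranks[k+1]).
    """
    ranks = [1] + [rank] * (order - 1) + [1]
    return [rng.standard_normal((ranks[k], dim, ranks[k + 1]))
            for k in range(order)]

def tt_to_full(cores):
    """Contract the TT cores back into the full dense tensor."""
    full = cores[0]                       # shape (1, dim, r1)
    for core in cores[1:]:
        # Contract the trailing rank index with the next core's
        # leading rank index.
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))     # drop the boundary ranks

rng = np.random.default_rng(0)
order, dim, rank = 4, 8, 3                # illustrative sizes

cores = tt_cores(order, dim, rank, rng)
full = tt_to_full(cores)

n_full = full.size                        # dense storage: dim**order = 4096
n_tt = sum(c.size for c in cores)         # TT storage: 192 parameters
print(n_full, n_tt)
```

The dense tensor grows exponentially in the order, while the TT cores grow only linearly (O(P · d · r²)), which is the trade-off the abstract appeals to when preserving higher-order interactions at a manageable parameter count.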
