SOTAVerified

Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections

2020-06-12 · ICLR 2021 · Code Available

Csaba Toth, Patric Bonnier, Harald Oberhauser


Abstract

Sequential data such as time series, video, or text can be challenging to analyse, as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity: reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object, the tensor algebra, to capture such dependencies. To address the innate computational complexity of high-degree tensors, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks for neural networks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification and generative models for video.
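To make the abstract's idea concrete, here is a minimal NumPy sketch of one low-rank sequence feature: a degree-m component of the sequence's tensor-algebra embedding, evaluated against a rank-1 functional built from direction vectors u_1, …, u_m. The function name and the choice of rank-1 functionals are illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def low_rank_seq_feature(x, us):
    """Degree-m sequence feature via a rank-1 tensor projection (illustrative).

    Computes the sum over ordered index tuples i1 < ... < im of
    <u_1, x_{i1}> * ... * <u_m, x_{im}>, i.e. a low-rank linear functional
    applied to the degree-m level of the sequence's tensor embedding.
    x:  (T, d) array, one row per time step.
    us: list of m direction vectors, each of shape (d,).
    """
    m = len(us)
    a = np.zeros(m + 1)
    a[0] = 1.0  # empty product for degree 0
    for x_t in x:
        inner = [u @ x_t for u in us]
        # Update higher degrees first so a[k-1] still refers to the previous
        # time step, which enforces strictly increasing indices i1 < ... < im.
        for k in range(m, 0, -1):
            a[k] += inner[k - 1] * a[k - 1]
    return a[m]

# Degree-2 feature of the scalar sequence (1, 2, 3) with u_1 = u_2 = (1,):
# sum over i1 < i2 of x_{i1} * x_{i2} = 1*2 + 1*3 + 2*3 = 11
val = low_rank_seq_feature(np.array([[1.0], [2.0], [3.0]]),
                           [np.array([1.0]), np.array([1.0])])
```

The recursion runs in O(T·m) time instead of enumerating all index tuples, which is the scalability point the abstract makes: the full degree-m tensor is never materialised, only its pairing with a low-rank functional.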

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| HMNIST | GP-VAE (B-NLST) | AUROC | 0.96 | | Unverified |
| PhysioNet Challenge 2012 | GP-VAE (B-NLST) | AUROC | 0.74 | | Unverified |
| Sprites | GP-VAE (B-NLST) | MSE | 0 | | Unverified |

Reproductions