Latent ODEs for Irregularly-Sampled Time Series
Yulia Rubanova, Ricky T. Q. Chen, David Duvenaud
Code
- github.com/YuliaRubanova/latent_ode (Official, PyTorch, ★ 0)
- github.com/patrick-kidger/torchcde (PyTorch, ★ 476)
- github.com/jacobjinkelly/easy-neural-ode (JAX, ★ 287)
- github.com/BorealisAI/continuous-time-flow-process (PyTorch, ★ 48)
- github.com/HerreraKrachTeichmann/NJODE (PyTorch, ★ 31)
- github.com/HerreraKrachTeichmann/ControlledODERNN (PyTorch, ★ 31)
- github.com/westny/neural-stability (PyTorch, ★ 8)
- github.com/Ldhlwh/Latent-ODE (PyTorch, ★ 0)
- github.com/gkrudah/ODEnet (PyTorch, ★ 0)
- github.com/ashysheya/ODE-RNN (PyTorch, ★ 0)
Abstract
Time series with non-uniform intervals occur in many applications, and are difficult to model using standard recurrent neural networks (RNNs). We generalize RNNs to have continuous-time hidden dynamics defined by ordinary differential equations (ODEs), a model we call ODE-RNNs. Furthermore, we use ODE-RNNs to replace the recognition network of the recently-proposed Latent ODE model. Both ODE-RNNs and Latent ODEs can naturally handle arbitrary time gaps between observations, and can explicitly model the probability of observation times using Poisson processes. We show experimentally that these ODE-based models outperform their RNN-based counterparts on irregularly-sampled data.
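The abstract's central idea, evolving the RNN hidden state through an ODE across the (possibly irregular) gap between observations and then applying a standard RNN update at each observation, can be sketched in a few lines. This is a toy illustration only, not the paper's implementation: it uses a fixed-step Euler integrator instead of an adaptive ODE solver, random untrained weights, and hypothetical names (`ODERNNCell`, `euler_ode_solve`).

```python
import numpy as np

def euler_ode_solve(h, f, t0, t1, steps=10):
    """Integrate dh/dt = f(h) from t0 to t1 with fixed-step Euler."""
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h)
    return h

class ODERNNCell:
    """Toy ODE-RNN: the hidden state follows an ODE between observations,
    and a standard RNN update is applied at each observation time."""
    def __init__(self, hidden_dim, input_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.Wf = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        self.Wh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        self.Wx = rng.normal(scale=0.1, size=(hidden_dim, input_dim))

    def dynamics(self, h):
        # ODE defining the continuous-time hidden-state evolution
        return np.tanh(self.Wf @ h)

    def forward(self, times, xs):
        h = np.zeros(self.Wf.shape[0])
        t_prev = times[0]
        states = []
        for t, x in zip(times, xs):
            # evolve h across the gap since the last observation
            h = euler_ode_solve(h, self.dynamics, t_prev, t)
            # standard RNN update at the observation itself
            h = np.tanh(self.Wh @ h + self.Wx @ x)
            states.append(h)
            t_prev = t
        return np.stack(states)

# Irregularly spaced observation times -- no resampling or imputation needed
times = [0.0, 0.3, 1.1, 1.2, 2.7]
xs = [np.ones(2) * i for i in range(len(times))]
cell = ODERNNCell(hidden_dim=4, input_dim=2)
states = cell.forward(times, xs)
print(states.shape)  # (5, 4)
```

Because the hidden state is defined at all times, not just at observation steps, the same cell handles any gap pattern; a conventional RNN would instead see each gap as a single uninformative step.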
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| MuJoCo | Latent ODE (ODE enc) | MSE (10^-2, 50% missing) | 1.26 | — | Unverified |
| MuJoCo | ODE-RNN | MSE (10^-2, 50% missing) | 26.46 | — | Unverified |
| PhysioNet Challenge 2012 | Latent ODE (ODE enc) | MSE (10^-3) | 2.23 | — | Unverified |
| PhysioNet Challenge 2012 | Latent ODE + Poisson | MSE (10^-3) | 2.21 | — | Unverified |