SOTAVerified

Temporal Attention Bottleneck is informative? Interpretability through Disentangled Generative Representations for Energy Time Series Disaggregation

2023-06-23 · ICML 2023 · Code Available

Khalid Oublal, Said Ladjal, David Benhaiem, Emmanuel Le-Borgne, François Roueff


Abstract

Generative models have garnered significant attention for their ability to address the challenge of source separation in disaggregation tasks. This approach holds promise for promoting energy conservation by enabling homeowners to obtain detailed information on their energy consumption solely through the analysis of aggregated load curves. Nevertheless, the model's ability to generalize and its interpretability remain two major challenges. To tackle these challenges, we introduce TAB-VAE (Temporal Attention Bottleneck for Variational Auto-Encoder), a generative model with a hierarchical architecture that addresses signature variability and provides robust, interpretable separation through the informative design of its latent-space representation. Our implementation and evaluation guidelines are available at https://github.com/oublalkhalid/TAB-VAE.
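To make the abstract's pipeline concrete, below is a minimal NumPy sketch of the general idea: a temporal-attention bottleneck summarizes an aggregate load window into a variational latent, and per-appliance decoder heads reconstruct the separated component signals. This is an illustrative assumption of the architecture's shape, not the authors' implementation (all weights, dimensions, and function names here are hypothetical); see the linked repository for the actual TAB-VAE code.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_attention_bottleneck(x, W_q, W_k, W_v):
    """Self-attention over time steps, mean-pooled into one bottleneck vector."""
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    weights = softmax(q @ k.T / np.sqrt(k.shape[-1]))  # (T, T) attention map
    return (weights @ v).mean(axis=0)                  # (d,) window summary

# Toy aggregate-load window: T time steps embedded in d feature dims.
T, d, n_appliances = 64, 8, 3
x = rng.normal(size=(T, d))
W_q, W_k, W_v = (0.1 * rng.normal(size=(d, d)) for _ in range(3))

z = temporal_attention_bottleneck(x, W_q, W_k, W_v)

# Variational head: Gaussian latent via the reparameterization trick.
W_mu, W_lv = 0.1 * rng.normal(size=(d, d)), 0.1 * rng.normal(size=(d, d))
mu, log_var = z @ W_mu, z @ W_lv
z_sample = mu + np.exp(0.5 * log_var) * rng.normal(size=d)

# Decoder: one linear head per appliance yields a disaggregated trace.
heads = 0.1 * rng.normal(size=(n_appliances, d, T))
components = np.stack([z_sample @ h for h in heads])  # (n_appliances, T)

print(components.shape)
```

In a trained model the attention map over time steps is what makes the bottleneck inspectable: it shows which parts of the aggregate curve each latent summary attends to, which is the kind of interpretability the abstract claims.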
