
Enhancing Masked Time-Series Modeling via Dropping Patches

2024-12-19

Tianyu Qiu, Yi Xie, Yun Xiong, Hao Niu, Xiaofeng Gao


Abstract

This paper explores how to enhance existing masked time-series modeling by randomly dropping sub-sequence-level patches of the time series. On this basis, a simple yet effective method named DropPatch is proposed, which has two remarkable advantages: 1) it improves pre-training efficiency by a square-level margin, since attention cost scales quadratically with the number of patches the encoder processes; 2) it provides additional gains in scenarios such as in-domain modeling, cross-domain transfer, few-shot learning, and cold start. The paper conducts comprehensive experiments to verify the effectiveness of the method and to analyze its internal mechanism. Empirically, DropPatch strengthens the attention mechanism, reduces information redundancy, and serves as an efficient means of data augmentation. Theoretically, it is proved that by randomly dropping patches, DropPatch slows the rate at which Transformer representations collapse onto a rank-1 linear subspace, thereby improving the quality of the learned representations.
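The abstract describes a two-stage pre-processing: first randomly drop a fraction of the patches outright (shrinking the sequence the encoder sees, which is where the square-level efficiency gain comes from), then apply the usual random masking to the patches that remain. The following is a minimal numpy sketch of that idea; the function name, parameter names, and the use of zeros as a stand-in for a learnable [MASK] embedding are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def drop_and_mask(series, patch_len=4, drop_ratio=0.5, mask_ratio=0.4, seed=0):
    """Illustrative DropPatch-style pre-processing (names assumed).

    1. Split a 1-D series into non-overlapping patches.
    2. DROP a random fraction of patches entirely, so the Transformer
       attends over a shorter sequence (quadratic cost savings).
    3. MASK a random fraction of the remaining patches for the
       masked-reconstruction objective.
    """
    rng = np.random.default_rng(seed)
    n_patches = len(series) // patch_len
    patches = series[: n_patches * patch_len].reshape(n_patches, patch_len)

    # Step 2: keep only a random subset of patch indices (order preserved).
    n_keep = max(1, int(round(n_patches * (1 - drop_ratio))))
    keep_idx = np.sort(rng.choice(n_patches, size=n_keep, replace=False))
    kept = patches[keep_idx]

    # Step 3: boolean mask over the kept patches; masked patches are
    # zeroed here as a stand-in for a learnable [MASK] token.
    n_mask = int(round(n_keep * mask_ratio))
    mask = np.zeros(n_keep, dtype=bool)
    mask[rng.choice(n_keep, size=n_mask, replace=False)] = True
    visible = np.where(mask[:, None], 0.0, kept)

    return visible, kept, mask, keep_idx
```

With `drop_ratio=0.5`, the encoder processes half as many patches, so self-attention does roughly a quarter of the work; the reconstruction loss would be computed only on the masked entries of `kept`.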
