
Effective Sample Size and Generalization Bounds for Temporal Networks

2026-03-03

Barak Gahtan, Alex M. Bronstein



Abstract

Learning from time series is fundamentally different from learning from i.i.d. data: temporal dependence can make long sequences effectively information-poor, yet standard evaluation protocols conflate sequence length with statistical information. We propose a dependence-aware evaluation methodology that controls for the effective sample size N_eff rather than the raw length N, and we provide end-to-end generalization guarantees for Temporal Convolutional Networks (TCNs) on β-mixing sequences. Our analysis combines a blocking/coupling reduction, which extracts B = Θ(N/√N) approximately independent anchors, with an architecture-aware Rademacher bound for ℓ_2,1-norm-controlled convolutional networks, yielding O(D√p/√B) complexity scaling in the depth D and kernel size p. Empirically, we find that stronger temporal dependence can reduce generalization gaps when comparisons control for N_eff, a conclusion that reverses under standard fixed-N evaluation; the observed rates of N_eff^-0.9 to N_eff^-1.2 are substantially faster than the worst-case O(N^-1/2) mixing-based prediction. Our results suggest that dependence-aware evaluation should become standard practice in temporal deep learning benchmarks.
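The blocking idea summarized in the abstract admits a short illustration. The sketch below is our own minimal rendering, not the paper's code: it assumes a block length of ⌈√N⌉ (matching B = Θ(N/√N)) and takes one anchor per block, so N_eff equals the number of anchors B. The names extract_anchors and effective_sample_size, and the AR(1) demo, are illustrative assumptions.

```python
import numpy as np


def extract_anchors(n: int, block_len: int | None = None) -> np.ndarray:
    """Indices of approximately independent anchor points in a length-n series.

    Assumed block length a = ceil(sqrt(n)), one anchor per block, giving
    B = Theta(n / sqrt(n)) anchors; under beta-mixing, points a steps apart
    are approximately independent.
    """
    if block_len is None:
        block_len = int(np.ceil(np.sqrt(n)))
    return np.arange(0, n, block_len)


def effective_sample_size(n: int, block_len: int | None = None) -> int:
    # N_eff = number of anchors B: the quantity the dependence-aware
    # evaluation controls for instead of the raw length N.
    return len(extract_anchors(n, block_len))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, phi = 10_000, 0.9
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):  # AR(1): strongly dependent toy sequence
        x[t] = phi * x[t - 1] + rng.normal()

    idx = extract_anchors(n)
    print(f"N = {n}, B = N_eff = {effective_sample_size(n)}")  # ~sqrt(N) = 100
    # Dependence between consecutive anchors decays like phi**block_len,
    # which is negligible here (0.9**100 ~ 3e-5).
    print("lag-a autocorrelation ~", phi ** (n // len(idx)))
```

Under these assumptions, a 10,000-step AR(1) sequence yields only about 100 effective samples, which is the sense in which a long but strongly dependent sequence is information-poor.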
