
Learning under Distributional Drift: Prequential Reproducibility as an Intrinsic Statistical Resource

2026-03-04

Sofiya Zaichyk


Abstract

Statistical learning under distributional drift remains poorly characterized, especially in closed-loop settings where learning alters the data-generating law. We introduce an intrinsic drift budget C_T that quantifies the cumulative information-geometric motion of the data distribution along the realized learner-environment trajectory, measured in Fisher-Rao distance (the Riemannian metric induced by Fisher information on a statistical manifold of data-generating laws). The budget decomposes this motion into exogenous change (environmental drift that would occur without intervention) and policy-sensitive feedback contributions (drift induced by the learner's actions through the closed loop). This yields a rate-based characterization: in prequential reproducibility bounds -- where performance on the realized stream is used to predict one-step-ahead performance under the next distribution -- the drift contribution enters through the average drift rate C_T/T, i.e., normalized cumulative Fisher-Rao motion per time step. We prove a drift--feedback bound of order T^-1/2 + C_T/T (up to a controlled second-order remainder) and establish a matching minimax lower bound on a canonical subclass, showing this dependence is tight up to constants. Consequently, when C_T/T is nonnegligible, one-step-ahead reproducibility admits an irreducible accuracy floor of the same order. Finally, the framework places exogenous drift, adaptive data analysis, and performative feedback within a common geometric account of distributional motion.
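The quantities in the abstract can be made concrete with a small numerical sketch. The snippet below is illustrative only: it assumes a one-parameter Bernoulli family of data-generating laws (for which the Fisher-Rao distance has the closed form 2|arcsin(sqrt(p)) - arcsin(sqrt(q))|) and a hypothetical linear drift path; the function names, the trajectory, and the choice of family are not from the paper, and the printed rate illustrates only the order T^-1/2 + C_T/T of the bound, not its constants.

```python
import math

def fisher_rao_bernoulli(p, q):
    # Closed-form Fisher-Rao distance on the Bernoulli statistical manifold:
    # d(p, q) = 2 * |arcsin(sqrt(p)) - arcsin(sqrt(q))|
    return 2.0 * abs(math.asin(math.sqrt(p)) - math.asin(math.sqrt(q)))

def drift_budget(params):
    # Cumulative Fisher-Rao motion C_T along the realized trajectory of
    # data-generating laws (here represented by Bernoulli parameters p_1..p_T).
    return sum(fisher_rao_bernoulli(a, b) for a, b in zip(params, params[1:]))

def bound_order(params):
    # Order of the drift-feedback bound: T^(-1/2) + C_T / T,
    # i.e., the statistical rate plus the average drift rate.
    T = len(params)
    return T ** -0.5 + drift_budget(params) / T

# Hypothetical example: slow exogenous drift from p = 0.3 to p = 0.5
# over T = 400 steps (a monotone path, so C_T telescopes to d(0.3, 0.5)).
T = 400
traj = [0.3 + 0.2 * t / (T - 1) for t in range(T)]
C_T = drift_budget(traj)
print(C_T, C_T / T, bound_order(traj))
```

Note that when the average drift rate C_T/T stays bounded away from zero, the second term does not vanish as T grows, which is exactly the irreducible one-step-ahead accuracy floor the abstract describes.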
