
Estimating the Local Learning Coefficient at Scale

2024-02-06 · Code Available

Zach Furman, Edmund Lau


Abstract

The local learning coefficient (LLC) is a principled way of quantifying model complexity, originally derived in the context of Bayesian statistics using singular learning theory (SLT). Several methods are known for numerically estimating the local learning coefficient, but so far these methods have not been extended to the scale of modern deep learning architectures or datasets. Using a method developed in arXiv:2308.12108 [stat.ML], we empirically show how the LLC may be measured accurately and self-consistently for deep linear networks (DLNs) of up to 100M parameters. We also show that the estimated LLC has the rescaling invariance that holds for the theoretical quantity.
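The rescaling invariance mentioned at the end of the abstract can be illustrated with a minimal numerical sketch (not the authors' code): a deep linear network computes f(x) = W_L ⋯ W_1 x, so multiplying one layer's weights by a constant c and dividing the next layer's by c leaves the function, and hence the theoretical LLC, unchanged, even though the parameters differ.

```python
import numpy as np

# Minimal sketch of the rescaling invariance of a deep linear network (DLN).
# All names here are illustrative; this is not the paper's implementation.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # first layer:  R^4 -> R^8
W2 = rng.normal(size=(3, 8))   # second layer: R^8 -> R^3
x = rng.normal(size=(4,))

c = 2.5
W1_scaled = c * W1             # scale one layer up by c ...
W2_scaled = W2 / c             # ... and the next layer down by c

f = W2 @ (W1 @ x)                        # original network output
f_scaled = W2_scaled @ (W1_scaled @ x)   # rescaled network output

# Different parameters, identical function on every input:
assert np.allclose(f, f_scaled)
```

A consistent LLC estimator should assign the same value at both parameter settings, which is the invariance the paper verifies empirically for the estimated quantity.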
