Theory-to-Practice Gap for Neural Networks and Neural Operators

2025-03-23

Philipp Grohs, Samuel Lanthaler, Margaret Trautner


Abstract

This work studies the sampling complexity of learning with ReLU neural networks and neural operators. For mappings belonging to relevant approximation spaces, we derive upper bounds on the best-possible convergence rate of any learning algorithm with respect to the number of samples. In the finite-dimensional case, these bounds imply a gap between the parametric and sampling complexities of learning, known as the theory-to-practice gap. We give a unified treatment of the theory-to-practice gap in a general L^p-setting, while at the same time improving the bounds available in the literature. Based on these results, we then extend the theory-to-practice gap to the infinite-dimensional setting of operator learning. Our results apply to Deep Operator Networks and integral kernel-based neural operators, including the Fourier neural operator. We show that the best-possible convergence rate in a Bochner L^p-norm is bounded by Monte-Carlo rates of order 1/p.
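As a rough numerical illustration of the Monte-Carlo rate mentioned in the abstract: for p = 2 the bound of order 1/p is the familiar n^(-1/2) rate of estimating an integral from n random samples. The sketch below (an assumption-laden toy demonstration, not the paper's construction) measures this rate empirically for a simple one-dimensional integrand.

```python
# Hedged sketch: empirical Monte-Carlo convergence rate ~ n^(-1/2),
# i.e. the p = 2 instance of the abstract's order-1/p bound.
# The integrand f(x) = x^2 on [0, 1] is a hypothetical example choice;
# its exact mean under the uniform distribution is 1/3.
import math
import random

random.seed(0)

def mc_error(n, trials=2000):
    """Root-mean-square error of the n-sample Monte-Carlo estimate
    of E[U^2] for U ~ Uniform(0, 1), averaged over many trials."""
    exact = 1.0 / 3.0
    sq = 0.0
    for _ in range(trials):
        est = sum(random.random() ** 2 for _ in range(n)) / n
        sq += (est - exact) ** 2
    return math.sqrt(sq / trials)

# Quadrupling n should roughly halve the error: an empirical
# exponent near 0.5 in error ~ n^(-exponent).
e1, e2 = mc_error(100), mc_error(400)
rate = math.log(e1 / e2) / math.log(4.0)
```

With the seed fixed, `rate` lands close to 0.5, matching the n^(-1/2) Monte-Carlo rate that the paper's bound specializes to when p = 2.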
