Minimax-optimal and Locally-adaptive Online Nonparametric Regression

2024-10-04

Paul Liautaud, Pierre Gaillard, Olivier Wintenberger


Abstract

We study adversarial online nonparametric regression with general convex losses and propose a parameter-free learning algorithm that achieves minimax optimal rates. Our approach leverages chaining trees to compete against Hölder functions and establishes optimal regret bounds. While competing with nonparametric function classes can be challenging, they often exhibit local patterns, such as local Hölder continuity, that online algorithms can exploit. Without prior knowledge, our method dynamically tracks and adapts to different Hölder profiles by pruning a core chaining tree structure, aligning itself with local smoothness variations. This leads to the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting. Finally, we discuss how these notions could be extended to a boosting framework, offering promising directions for future research.
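The chaining-tree idea can be illustrated with a toy sketch. This is not the paper's algorithm and all names (`Node`, `build_tree`, the depth-scaled learning rate) are illustrative assumptions: a dyadic tree partitions [0, 1), each node holds an additive correction updated by online gradient descent on the squared loss, and the forecast at a point is the sum of corrections along the root-to-leaf path, so coarse nodes capture global structure while deep nodes refine it locally.

```python
import random

class Node:
    """A node of a dyadic chaining tree over [0, 1) (illustrative sketch)."""
    def __init__(self, depth, left_end, right_end):
        self.depth = depth
        self.left_end, self.right_end = left_end, right_end
        self.weight = 0.0          # this node's additive correction
        self.children = None

    def split(self):
        mid = (self.left_end + self.right_end) / 2
        self.children = (Node(self.depth + 1, self.left_end, mid),
                         Node(self.depth + 1, mid, self.right_end))

def build_tree(max_depth):
    root = Node(0, 0.0, 1.0)
    frontier = [root]
    while frontier:
        node = frontier.pop()
        if node.depth < max_depth:
            node.split()
            frontier.extend(node.children)
    return root

def path(root, x):
    """Nodes along the root-to-leaf path whose intervals contain x."""
    node, nodes = root, [root]
    while node.children:
        left, right = node.children
        node = left if x < left.right_end else right
        nodes.append(node)
    return nodes

def predict(root, x):
    # Chaining: the forecast is the sum of corrections along the path.
    return sum(n.weight for n in path(root, x))

def update(root, x, y, lr=0.5):
    """One online gradient step on 0.5*(pred - y)^2 for each path node."""
    grad = predict(root, x) - y
    for n in path(root, x):
        # Deeper nodes take smaller steps: finer scales adjust more gently.
        n.weight -= lr * grad / (2 ** n.depth)

# Toy stream: learn f(x) = x^2 online from one point at a time.
random.seed(0)
tree = build_tree(max_depth=6)
for _ in range(5000):
    x = random.random()
    update(tree, x, x * x)

err = max(abs(predict(tree, i / 100) - (i / 100) ** 2) for i in range(100))
```

Pruning the tree at different depths in different regions, as the abstract describes, would correspond here to freezing deep subtrees where the target is smooth and keeping them where it varies quickly.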
