NYTRO: When Subsampling Meets Early Stopping

2015-10-19

Tomas Angles, Raffaello Camoriano, Alessandro Rudi, Lorenzo Rosasco


Abstract

Early stopping is a well-known approach to reducing the time complexity of training and model selection for large-scale learning machines. On the other hand, memory/space (rather than time) complexity is the main constraint in many applications, and randomized subsampling techniques have been proposed to tackle this issue. In this paper we ask whether early stopping and subsampling ideas can be combined in a fruitful way. We consider the question in a least squares regression setting and propose a form of randomized iterative regularization based on early stopping and subsampling. In this context, we analyze the statistical and computational properties of the proposed method. Theoretical results are complemented and validated by a thorough experimental analysis.
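The combination the abstract describes can be illustrated with a minimal sketch: restrict the kernel least squares problem to a random Nyström subsample of the training points (controlling memory), then run plain gradient descent on the subsampled problem and stop after a fixed number of iterations, so the iteration count acts as the regularization parameter. This is only an illustrative assumption of the general idea, not the paper's exact NYTRO algorithm; the kernel choice, the uniform subsampling, and the fixed step size are all simplifying assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def subsampled_early_stopping(X, y, m=50, t_max=200, seed=0):
    """Illustrative sketch: Nyström subsampling + early-stopped gradient descent.

    Memory is controlled by m (only an n-by-m kernel block is stored);
    regularization is controlled by t_max (number of gradient iterations).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Randomized subsampling: pick m columns (Nyström centers) uniformly.
    idx = rng.choice(n, size=m, replace=False)
    Knm = gaussian_kernel(X, X[idx])  # n x m, instead of the full n x n matrix

    # Gradient descent on the least squares objective ||Knm a - y||^2 / 2,
    # with a step size guaranteeing monotone decrease of the objective.
    step = 1.0 / np.linalg.norm(Knm, 2) ** 2
    a = np.zeros(m)
    for _ in range(t_max):  # early stopping: iterate only t_max times
        a -= step * Knm.T @ (Knm @ a - y)
    return idx, a

# Usage: fit a smooth 1-D target and predict via f(x) = sum_j a_j k(x, c_j).
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
idx, a = subsampled_early_stopping(X, y, m=50, t_max=200)
pred = gaussian_kernel(X, X[idx]) @ a
```

Note the trade-off the paper studies: `m` bounds the space complexity, while `t_max` bounds the time complexity and simultaneously acts as the (iterative) regularizer.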
