
Sparsified-Learning for Heavy-Tailed Locally Stationary Processes

2025-04-08

Yingjie Wang, Mokhtar Z. Alaya, Salim Bouzebda, Xinsheng Liu


Abstract

Sparsified learning is ubiquitous in many machine learning tasks. It regularizes the objective function by adding a penalty term that encodes the constraints placed on the learned parameters. This paper considers the problem of learning heavy-tailed locally stationary processes (LSPs). We develop a flexible and robust sparse-learning framework capable of handling heavy-tailed data with locally stationary behavior, and we establish concentration inequalities. We further provide non-asymptotic oracle inequalities for different types of sparsity, including ℓ1-norm and total-variation penalization for the least-squares loss.
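The abstract's core objects can be illustrated with a minimal sketch: ℓ1-penalized least squares solved by iterative soft-thresholding (ISTA), fit on synthetic data with heavy-tailed (Student-t) noise. This is a generic illustration of sparsity-penalized least squares, not the paper's estimator; the data-generating setup and all names below are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # Minimize (1/2n) * ||y - X b||^2 + lam * ||b||_1 via ISTA
    n, p = X.shape
    step = n / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n          # gradient of the least-squares term
        b = soft_threshold(b - step * grad, step * lam)  # proximal step
    return b

# Hypothetical synthetic example: sparse truth, heavy-tailed noise
rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
b_true = np.zeros(p)
b_true[:3] = [2.0, -1.5, 1.0]
y = X @ b_true + rng.standard_t(df=3, size=n)  # Student-t noise: heavy tails
b_hat = lasso_ista(X, y, lam=0.1)
```

The soft-thresholding step is what produces exact zeros in the estimate; total-variation penalization would instead apply a proximal operator to successive differences of the coefficients.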
