
Excess risk bound for deep learning under weak dependence

2023-02-15

William Kengne


Abstract

This paper considers deep neural networks for learning weakly dependent processes in a general framework that includes, for instance, regression estimation, time series prediction and time series classification. The ψ-weak dependence structure considered is quite general and covers other conditions such as mixing and association. Firstly, the approximation of smooth functions by deep neural networks with a broad class of activation functions is considered. We derive the required depth, width and sparsity of a deep neural network to approximate any Hölder smooth function defined on a compact set. Secondly, we establish a bound on the excess risk for the learning of weakly dependent observations by deep neural networks. When the target function is sufficiently smooth, this bound is close to the usual O(n^{-1/2}).
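The two-part structure of the abstract (an approximation result, then an estimation result) reflects the standard excess risk decomposition. The following LaTeX sketch is illustrative only, not the paper's exact statement; the notation (empirical risk minimizer \(\hat{h}_n\), network class \(\mathcal{H}_\sigma(L,N,S)\) with depth \(L\), width \(N\), sparsity \(S\), and estimation term \(\Delta_n\)) is assumed here for exposition:

```latex
% Illustrative excess risk decomposition (assumed notation, not the
% paper's exact statement). For an empirical risk minimizer \hat{h}_n
% over a class of sparse deep networks \mathcal{H}_\sigma(L, N, S):
\mathbb{E}\, R(\hat{h}_n) - R(h^{*})
  \;\le\;
  \underbrace{\inf_{h \in \mathcal{H}_\sigma(L, N, S)}
      \bigl( R(h) - R(h^{*}) \bigr)}_{\text{approximation error}}
  \;+\;
  \underbrace{\Delta_n(L, N, S)}_{\text{estimation error}}
```

The first term is controlled by the approximation result for Hölder smooth targets; the second grows with the network's depth, width and sparsity, so balancing the two terms by a suitable choice of \(L\), \(N\) and \(S\) yields a rate close to O(n^{-1/2}) when the target function is sufficiently smooth.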
