
Deep Network Approximation for Smooth Functions

2020-01-09

Jianfeng Lu, Zuowei Shen, Haizhao Yang, Shijun Zhang


Abstract

This paper establishes the (nearly) optimal approximation error characterization of deep rectified linear unit (ReLU) networks for smooth functions in terms of both width and depth simultaneously. To that end, we first prove that multivariate polynomials can be approximated by deep ReLU networks of width O(N) and depth O(L) with an approximation error O(N^{-L}). Through local Taylor expansions and their deep ReLU network approximations, we show that deep ReLU networks of width O(N log N) and depth O(L log L) can approximate f ∈ C^s([0,1]^d) with a nearly optimal approximation error O(‖f‖_{C^s([0,1]^d)} N^{-2s/d} L^{-2s/d}). Our estimate is non-asymptotic in the sense that it is valid for arbitrary width and depth specified by N ∈ ℕ⁺ and L ∈ ℕ⁺, respectively.
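As a minimal numerical sketch of the rate stated in the abstract (ignoring the hidden constant and the log factors in the width and depth), the function below evaluates the bound N^{-2s/d} L^{-2s/d} for chosen smoothness s and dimension d; the function name and the specific values of N, L, s, d are illustrative, not taken from the paper.

```python
# Sketch: evaluate the claimed approximation rate N^{-2s/d} L^{-2s/d}
# for f in C^s([0,1]^d), up to the constant and log factors the
# abstract absorbs into the O(.) notation.

def approx_rate(N: int, L: int, s: int, d: int) -> float:
    """Nearly optimal error rate in terms of width index N and depth index L."""
    return N ** (-2 * s / d) * L ** (-2 * s / d)

# Width and depth enter the bound symmetrically: doubling either N or L
# shrinks the bound by the same factor 2^{-2s/d}.
r_base = approx_rate(N=8, L=8, s=2, d=4)    # 2s/d = 1 here
r_wide = approx_rate(N=16, L=8, s=2, d=4)
r_deep = approx_rate(N=8, L=16, s=2, d=4)
assert abs(r_wide - r_base / 2 ** (2 * 2 / 4)) < 1e-12
assert abs(r_wide - r_deep) < 1e-12
```

The symmetry between N and L in this expression is the point of the "width and depth simultaneously" claim: the error decays in both parameters at the same polynomial rate.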
