Sample Complexity Bounds for Learning High-dimensional Simplices in Noisy Regimes

2022-09-09

Amir Hossein Saberi, Amir Najafi, Seyed Abolfazl Motahari, Babak H. Khalaj

Abstract

In this paper, we find a sample complexity bound for learning a simplex from noisy samples. Assume a dataset of size n is given which includes i.i.d. samples drawn from a uniform distribution over an unknown simplex in R^K, where samples are assumed to be corrupted by multivariate additive Gaussian noise of an arbitrary magnitude. We prove the existence of an algorithm that with high probability outputs a simplex having an ℓ_2 distance of at most ε from the true simplex (for any ε > 0). Also, we theoretically show that in order to achieve this bound, it is sufficient to have n ≥ Õ(K^2/ε^2) e^(K/SNR^2) samples, where SNR stands for the signal-to-noise ratio. This result solves an important open problem and shows that as long as SNR ≥ Ω(K^1/2), the sample complexity of the noisy regime has the same order as that of the noiseless case. Our proofs are a combination of the so-called sample compression technique of [ashtiani2018nearly], mathematical tools from high-dimensional geometry, and Fourier analysis. In particular, we have proposed a general Fourier-based technique for recovery of a more general class of distribution families from additive Gaussian noise, which can be further used in a variety of other related problems.
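The bound in the abstract can be made concrete with a short numerical sketch. The snippet below (the function name `sample_complexity` is ours, and constants and logarithmic factors are deliberately omitted) evaluates the stated order n ≥ Õ(K^2/ε^2) e^(K/SNR^2) and illustrates the claimed threshold: at SNR = √K the exponential factor is only e^1, so the noisy bound matches the noiseless-case order K^2/ε^2 up to a constant.

```python
import math

def sample_complexity(K: int, eps: float, snr: float) -> float:
    """Order-of-magnitude evaluation of the abstract's bound
    n >= O~(K^2 / eps^2) * e^(K / SNR^2).
    Constants and log factors are ignored; this is illustrative only."""
    return (K ** 2 / eps ** 2) * math.exp(K / snr ** 2)

K, eps = 100, 0.1
noiseless_order = K ** 2 / eps ** 2          # order of the noiseless bound

# At the threshold SNR = sqrt(K), the exponential blow-up is just e ~ 2.72,
# so noisy and noiseless sample complexities have the same order.
ratio_at_threshold = sample_complexity(K, eps, math.sqrt(K)) / noiseless_order
print(ratio_at_threshold)

# Well below the threshold (SNR = sqrt(K)/3), the factor is e^9 ~ 8100,
# showing how quickly the exponential term dominates in low-SNR regimes.
ratio_low_snr = sample_complexity(K, eps, math.sqrt(K) / 3) / noiseless_order
print(ratio_low_snr)
```

This mirrors the abstract's qualitative message: the e^(K/SNR^2) factor is benign exactly when SNR grows at least like √K.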
