Formal limitations of sample-wise information-theoretic generalization bounds

2022-05-13

Hrayr Harutyunyan, Greg Ver Steeg, Aram Galstyan

Abstract

Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that even for the expected squared generalization gap no such sample-wise information-theoretic bounds exist. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
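For context, the sample-wise bounds referred to in the abstract have the following general shape; a representative instance is the bound of Bu, Zou, and Veeravalli (2020), stated here under the standard assumption that the loss is $\sigma$-subgaussian:

```latex
% Sample-wise information-theoretic generalization bound (Bu et al., 2020).
% W is the learned hypothesis, S = (Z_1, \ldots, Z_n) the training set,
% and I(W; Z_i) the mutual information between W and the i-th example.
\left| \mathbb{E}\!\left[ \mathrm{gen}(W, S) \right] \right|
  \;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)}
```

The paper's negative result says that no bound of this single-example form can control the expected *squared* generalization gap (or hold in the PAC-Bayes or single-draw sense), whereas bounds depending on information in pairs of examples, $I(W; Z_i, Z_j)$, can.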
