PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior

2021-02-07

Tianyu Liu, Jie Lu, Zheng Yan, Guangquan Zhang

Abstract

By leveraging experience from previous tasks, meta-learning algorithms can adapt quickly and effectively to new tasks. However, it is unclear how well their generalization properties carry over to new tasks. Probably approximately correct (PAC) Bayes theory provides a framework for analyzing the generalization performance of meta-learning. We derive three novel generalization error bounds for meta-learning based on the PAC-Bayes relative entropy bound. Furthermore, using the empirical risk minimization (ERM) method, we develop a PAC-Bayes bound for meta-learning with a data-dependent prior. Experiments illustrate that the three proposed PAC-Bayes bounds provide a competitive generalization guarantee for meta-learning, and that the extended PAC-Bayes bound with a data-dependent prior achieves rapid convergence.
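The abstract's key ingredients can be illustrated with the classical single-task McAllester PAC-Bayes bound, on which relative-entropy bounds of this kind build: the generalization gap is controlled by the KL divergence between a posterior Q and a prior P, so a data-dependent prior that sits closer to the learned posterior shrinks the KL term and tightens the bound. This is a minimal sketch under simplifying assumptions (univariate Gaussian posterior and prior, 0-1 risk already computed); it is not the paper's meta-learning bound, and all function names are illustrative.

```python
import math

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """KL(Q || P) between two univariate Gaussians N(mu_q, sigma_q^2), N(mu_p, sigma_p^2)."""
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
            - 0.5)

def mcallester_bound(emp_risk, kl, m, delta=0.05):
    """McAllester-style PAC-Bayes bound (holds with prob. >= 1 - delta over m samples):
       L(Q) <= L_hat(Q) + sqrt((KL(Q||P) + ln(2*sqrt(m)/delta)) / (2m))."""
    complexity = (kl + math.log(2 * math.sqrt(m) / delta)) / (2 * m)
    return emp_risk + math.sqrt(complexity)

# Fixed (data-free) prior far from the posterior vs. a data-dependent prior
# centered near it: the latter has smaller KL, hence a tighter bound.
m, emp_risk = 1000, 0.10
kl_fixed = kl_gaussians(mu_q=2.0, sigma_q=1.0, mu_p=0.0, sigma_p=1.0)
kl_data_dep = kl_gaussians(mu_q=2.0, sigma_q=1.0, mu_p=1.8, sigma_p=1.0)
bound_fixed = mcallester_bound(emp_risk, kl_fixed, m)
bound_data_dep = mcallester_bound(emp_risk, kl_data_dep, m)
```

Here the data-dependent prior mean (1.8) stands in for a prior learned from held-out or previous-task data, as in the meta-learning setting; the bound improvement comes entirely from the reduced KL term.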
