Few-Shot Learning as Domain Adaptation: Algorithm and Analysis
Jiechao Guan, Zhiwu Lu, Tao Xiang, Ji-Rong Wen
Abstract
To recognize unseen classes with only a few samples, few-shot learning (FSL) uses prior knowledge learned from the seen classes. A major challenge for FSL is that the distribution of the unseen classes differs from that of the seen ones, resulting in poor generalization even when a model is meta-trained on the seen classes. This distribution shift caused by class differences can be considered a special case of domain shift. In this paper, for the first time, we propose a domain adaptation prototypical network with attention (DAPNA) to explicitly tackle such a domain shift problem in a meta-learning framework. Specifically, armed with a set-transformer-based attention module, we construct each episode from two sub-episodes of the seen classes without class overlap, to simulate the domain shift between the seen and unseen classes. To align the feature distributions of the two sub-episodes with limited training samples, a feature transfer network is employed together with a margin disparity discrepancy (MDD) loss. Importantly, theoretical analysis is provided to give the learning bound of our DAPNA. Extensive experiments show that our DAPNA outperforms the state-of-the-art FSL alternatives, often by significant margins.
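The two-sub-episode construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `make_sub_episodes`, the flat `(image, label)` representation, and the sampling sizes are all assumptions; the only property taken from the paper is that the two sub-episodes share no classes, so one can act as a simulated "unseen" domain for the other.

```python
import random

def make_sub_episodes(class_to_images, n_way=5, k_shot=1, q_query=15, seed=0):
    """Sample one episode as two class-disjoint sub-episodes (a sketch of
    DAPNA's episode design; exact sampling details are assumptions).

    class_to_images: dict mapping a class name to its list of images.
    Returns ((support_a, query_a), (support_b, query_b)), where each
    support/query list holds (image, class) pairs.
    """
    rng = random.Random(seed)
    # Draw 2 * n_way distinct seen classes, then split them in half so the
    # two sub-episodes have no class overlap.
    classes = rng.sample(sorted(class_to_images), 2 * n_way)
    sub_a, sub_b = classes[:n_way], classes[n_way:]

    def sample_episode(chosen):
        support, query = [], []
        for c in chosen:
            imgs = rng.sample(class_to_images[c], k_shot + q_query)
            support += [(x, c) for x in imgs[:k_shot]]
            query += [(x, c) for x in imgs[k_shot:]]
        return support, query

    return sample_episode(sub_a), sample_episode(sub_b)
```

Because the two halves are disjoint, aligning their features (e.g. with the MDD loss below) trains the model under a controlled, seen-class-only proxy for the seen-to-unseen shift.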
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Mini-ImageNet 5-way (1-shot) | DAPNA | Accuracy (%) | 71.88 | — | Unverified |
| Mini-ImageNet 5-way (5-shot) | DAPNA | Accuracy (%) | 84.07 | — | Unverified |
| Mini-ImageNet-CUB 5-way (1-shot) | DAPNA | Accuracy (%) | 49.44 | — | Unverified |
| Mini-ImageNet-CUB 5-way (5-shot) | DAPNA | Accuracy (%) | 68.33 | — | Unverified |
| Tiered ImageNet 5-way (1-shot) | DAPNA | Accuracy (%) | 69.14 | — | Unverified |
| Tiered ImageNet 5-way (5-shot) | DAPNA | Accuracy (%) | 85.82 | — | Unverified |