Anti-aliasing Semantic Reconstruction for Few-Shot Semantic Segmentation

2021-06-01 · CVPR 2021 · Code Available

Binghao Liu, Yao Ding, Jianbin Jiao, Xiangyang Ji, Qixiang Ye

Abstract

Encouraging progress in few-shot semantic segmentation has been made by leveraging features learned on base classes with sufficient training data to represent novel classes with few-shot examples. However, this feature-sharing mechanism inevitably causes semantic aliasing between novel classes when they have similar compositions of semantic concepts. In this paper, we reformulate few-shot segmentation as a semantic reconstruction problem, and convert base class features into a series of basis vectors which span a class-level semantic space for novel class reconstruction. By introducing a contrastive loss, we maximize the orthogonality of basis vectors while minimizing semantic aliasing between classes. Within the reconstructed representation space, we further suppress interference from other classes by projecting query features onto the support vector for precise semantic activation. Our proposed approach, referred to as anti-aliasing semantic reconstruction (ASR), provides a systematic yet interpretable solution for few-shot learning problems. Extensive experiments on the PASCAL VOC and MS COCO datasets show that ASR achieves strong results compared with prior works.
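The two core operations the abstract describes, reconstructing a novel-class prototype in the span of base-class basis vectors, then activating query pixels by projecting their features onto that prototype, can be sketched as below. This is a minimal NumPy illustration under assumed shapes and function names, not the authors' implementation:

```python
import numpy as np

def reconstruct_prototype(support_feat, basis):
    """Least-squares reconstruction of a support feature in the span of
    class-level basis vectors (basis: K x D, support_feat: D,).
    Hypothetical stand-in for the paper's semantic reconstruction step."""
    coeffs, *_ = np.linalg.lstsq(basis.T, support_feat, rcond=None)
    return basis.T @ coeffs  # D-dimensional reconstructed prototype

def activate_query(query_feats, prototype, eps=1e-8):
    """Project query features (N x D) onto the prototype direction,
    suppressing components orthogonal to the support semantics."""
    direction = prototype / (np.linalg.norm(prototype) + eps)
    return query_feats @ direction  # per-pixel activation scores (N,)

rng = np.random.default_rng(0)
basis = rng.standard_normal((4, 8))   # 4 basis vectors of dimension 8 (illustrative)
support = rng.standard_normal(8)      # one support feature
proto = reconstruct_prototype(support, basis)
scores = activate_query(rng.standard_normal((5, 8)), proto)
print(scores.shape)
```

The reconstruction confines the prototype to the base-class semantic span, and the projection keeps only the query-feature component aligned with the support semantics, which is the aliasing-suppression intuition stated in the abstract.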

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| COCO-20i (1-shot) | ASR (ResNet-50) | Mean IoU | 33.85 | | Unverified |
| COCO-20i (5-shot) | ASR (ResNet-50) | Mean IoU | 35.75 | | Unverified |
| PASCAL-5i (1-shot) | ASR (ResNet-50) | Mean IoU | 58.16 | | Unverified |
| PASCAL-5i (1-shot) | ASR (VGG-16) | Mean IoU | 55.66 | | Unverified |
| PASCAL-5i (5-shot) | ASR (ResNet-50) | Mean IoU | 60.96 | | Unverified |
| PASCAL-5i (5-shot) | ASR (VGG-16) | Mean IoU | 57.99 | | Unverified |
