SOTAVerified

Adaptive Prototype Learning and Allocation for Few-Shot Segmentation

2021-04-05 · CVPR 2021 · Code Available

Gen Li, Varun Jampani, Laura Sevilla-Lara, Deqing Sun, Jonghyun Kim, Joongkyu Kim


Abstract

Prototype learning is extensively used for few-shot segmentation. Typically, a single prototype is obtained from the support feature by averaging the global object information. However, using one prototype to represent all the information may lead to ambiguities. In this paper, we propose two novel modules, named superpixel-guided clustering (SGC) and guided prototype allocation (GPA), for multiple prototype extraction and allocation. Specifically, SGC is a parameter-free and training-free approach, which extracts more representative prototypes by aggregating similar feature vectors, while GPA is able to select matched prototypes to provide more accurate guidance. By integrating the SGC and GPA together, we propose the Adaptive Superpixel-guided Network (ASGNet), which is a lightweight model and adapts to object scale and shape variation. In addition, our network can easily generalize to k-shot segmentation with substantial improvement and no additional computational cost. In particular, our evaluations on COCO demonstrate that ASGNet surpasses the state-of-the-art method by 5% in 5-shot segmentation.
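The abstract contrasts a single masked-average prototype with multiple prototypes that are extracted by clustering and then allocated per query location. The sketch below is not the authors' implementation; it is a minimal NumPy illustration of the two ideas under simplifying assumptions: a plain k-means-style loop stands in for superpixel-guided clustering (SGC), and per-pixel cosine-similarity matching stands in for guided prototype allocation (GPA). All function names and shapes here are hypothetical.

```python
import numpy as np

def normalize(x, eps=1e-8):
    """L2-normalize vectors along the last axis."""
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def extract_prototypes(features, mask, num_prototypes=3, iters=10):
    """Cluster masked support features into multiple prototypes.

    A k-means-style stand-in for SGC: `features` is (H, W, C),
    `mask` is a (H, W) binary foreground mask.
    """
    vecs = features[mask.astype(bool)]                 # (N, C) foreground vectors
    # Initialize centers from evenly spaced foreground vectors.
    idx = np.linspace(0, len(vecs) - 1, num_prototypes).astype(int)
    centers = vecs[idx].copy()
    for _ in range(iters):
        # Assign each vector to its most similar center (cosine similarity).
        sim = normalize(vecs) @ normalize(centers).T   # (N, K)
        assign = sim.argmax(axis=1)
        for k in range(num_prototypes):
            members = vecs[assign == k]
            if len(members):                           # keep old center if empty
                centers[k] = members.mean(axis=0)
    return centers                                     # (K, C) prototypes

def allocate_prototypes(query_features, prototypes):
    """Pick the best-matching prototype per query location (GPA-style)."""
    q = normalize(query_features)                      # (H, W, C)
    p = normalize(prototypes)                          # (K, C)
    sim = q @ p.T                                      # (H, W, K) similarities
    best = sim.argmax(axis=-1)                         # per-pixel prototype index
    guide = prototypes[best]                           # (H, W, C) guidance map
    return guide, sim.max(axis=-1)
```

Because allocation only compares each query vector against the prototype set, averaging support features over k shots before clustering generalizes to k-shot segmentation without extra per-shot computation, which matches the abstract's claim of no additional cost.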

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| COCO-20i (1-shot) | ASGNet (ResNet-50) | Mean IoU | 34.56 | — | Unverified |
| COCO-20i (5-shot) | ASGNet (ResNet-50) | Mean IoU | 42.48 | — | Unverified |
| PASCAL-5i (1-shot) | ASGNet (ResNet-101) | Mean IoU | 59.31 | — | Unverified |
| PASCAL-5i (1-shot) | ASGNet (ResNet-50) | Mean IoU | 59.29 | — | Unverified |
| PASCAL-5i (5-shot) | ASGNet (ResNet-101) | Mean IoU | 64.36 | — | Unverified |
| PASCAL-5i (5-shot) | ASGNet (ResNet-50) | Mean IoU | 63.94 | — | Unverified |
