Body Part-Based Representation Learning for Occluded Person Re-Identification

2022-11-07 · Code Available

Vladimir Somers, Christophe De Vleeschouwer, Alexandre Alahi


Abstract

Occluded person re-identification (ReID) is a person retrieval task that aims to match occluded person images with holistic ones. Part-based methods have proven beneficial for occluded ReID, as they provide fine-grained information and are well suited to representing partially visible human bodies. However, training a part-based model is challenging for two reasons. First, the appearance of an individual body part is less discriminative than global appearance (two distinct identities can share the same local appearance), so standard ReID training objectives based on identity labels are ill-suited to local feature learning. Second, ReID datasets do not provide human topographical annotations. In this work, we propose BPBreID, a body part-based ReID model that addresses both issues. We first design two modules for predicting body part attention maps and producing body part-based features of the ReID target. We then propose GiLt, a novel training scheme for learning part-based representations that is robust to occlusions and to non-discriminative local appearance. Extensive experiments on popular holistic and occluded datasets show the effectiveness of our proposed method, which outperforms state-of-the-art methods by 0.7% mAP and 5.6% rank-1 accuracy on the challenging Occluded-Duke dataset. Our code is available at https://github.com/VlSomers/bpbreid.
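The core idea of producing body part-based features from attention maps can be sketched as attention-weighted pooling: each of K predicted part-attention maps softly selects the pixels of the backbone feature map belonging to that part, and a per-part embedding is obtained by averaging features under that attention. This is a minimal illustrative sketch in numpy, not the authors' implementation; the function name, tensor shapes, and the choice of a per-pixel softmax over parts are assumptions for illustration.

```python
import numpy as np

def part_based_pooling(feat_map, part_logits):
    """Pool a backbone feature map into one embedding per body part.

    feat_map:    (C, H, W) backbone features (shapes are illustrative).
    part_logits: (K, H, W) unnormalized part-attention scores for K parts.
    Returns:     (K, C) part embeddings.
    """
    # Softmax over the part axis: each spatial location is softly
    # assigned to the K body parts.
    e = np.exp(part_logits - part_logits.max(axis=0, keepdims=True))
    attn = e / e.sum(axis=0, keepdims=True)                  # (K, H, W)
    # Normalize each part's attention to sum to 1 over space, then
    # take the attention-weighted average of the features.
    weights = attn / (attn.sum(axis=(1, 2), keepdims=True) + 1e-8)
    return np.einsum('khw,chw->kc', weights, feat_map)       # (K, C)

rng = np.random.default_rng(0)
feats = rng.standard_normal((256, 16, 8))   # toy backbone output
logits = rng.standard_normal((5, 16, 8))    # 5 hypothetical body parts
parts = part_based_pooling(feats, logits)
print(parts.shape)  # (5, 256)
```

With uniform attention this reduces to global average pooling, so each part embedding is a strict generalization of the usual global descriptor.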

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| DukeMTMC-reID | BPBreID | mAP | 84.2 | — | Unverified |
| DukeMTMC-reID | BPBreID (RK) | mAP | 92.9 | — | Unverified |
| Market-1501 | BPBreID | Rank-1 | 95.7 | — | Unverified |
| Market-1501 | BPBreID (RK) | Rank-1 | 96.4 | — | Unverified |
| Occluded-DukeMTMC | BPBreID | Rank-1 | 75.1 | — | Unverified |
| Occluded-REID | BPBreID | mAP | 75.2 | — | Unverified |
| P-DukeMTMC-reID | BPBreID | mAP | 83.2 | — | Unverified |

Reproductions