Improved Baselines with Momentum Contrastive Learning

2020-03-09 · Code Available

Xinlei Chen, Haoqi Fan, Ross Girshick, Kaiming He

Abstract

Contrastive unsupervised learning has recently shown encouraging progress, e.g., in Momentum Contrast (MoCo) and SimCLR. In this note, we verify the effectiveness of two of SimCLR's design improvements by implementing them in the MoCo framework. With simple modifications to MoCo (namely, using an MLP projection head and more data augmentation), we establish stronger baselines that outperform SimCLR and do not require large training batches. We hope this will make state-of-the-art unsupervised learning research more accessible. Code will be made public.
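
Below is a minimal PyTorch sketch of the two SimCLR-derived modifications the abstract describes: replacing MoCo v1's linear projection head with a 2-layer MLP, and strengthening the augmentation recipe with color jitter and Gaussian blur. The class name `ProjectionMLP`, the layer widths, and the exact augmentation parameters are illustrative assumptions drawn from common reimplementations, not taken from the released code.

```python
import torch
import torch.nn as nn
import torchvision.transforms as T


class ProjectionMLP(nn.Module):
    """2-layer MLP projection head (MoCo v2 / SimCLR style).

    Replaces the single linear layer used in MoCo v1. The widths are
    illustrative: 2048 matches a ResNet-50 backbone's feature width,
    128 is the embedding dimension fed to the contrastive loss.
    """

    def __init__(self, in_dim=2048, hidden_dim=2048, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)


# Stronger augmentation: MoCo v1's crop/grayscale/flip recipe plus
# SimCLR-style color jitter and Gaussian blur. Probabilities and blur
# parameters below follow common reimplementations and may differ
# from the paper's exact values.
moco_v2_augmentation = T.Compose([
    T.RandomResizedCrop(224, scale=(0.2, 1.0)),
    T.RandomApply([T.ColorJitter(0.4, 0.4, 0.4, 0.1)], p=0.8),
    T.RandomGrayscale(p=0.2),
    T.RandomApply([T.GaussianBlur(kernel_size=23, sigma=(0.1, 2.0))], p=0.5),
    T.RandomHorizontalFlip(),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

if __name__ == "__main__":
    head = ProjectionMLP()
    z = head(torch.randn(8, 2048))
    print(z.shape)  # torch.Size([8, 128])
```

Both changes are drop-in: the MLP head only affects pretraining (like SimCLR's, it is discarded before linear evaluation), and the augmentation swap touches only the data loader, which is why they compose with MoCo's momentum queue without requiring SimCLR's large training batches.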

Benchmark Results

Dataset     Model     Metric           Claimed   Verified   Status
Places205   MoCo v2   Top-1 Accuracy   52.9      -          Unverified

Reproductions

None yet. Be the first to reproduce this paper.