EMC^2: Efficient MCMC Negative Sampling for Contrastive Learning with Global Convergence
Chung-Yiu Yau, Hoi-To Wai, Parameswaran Raman, Soumajyoti Sarkar, Mingyi Hong
Code
- github.com/amazon-science/contrastive_emc2 (official, PyTorch)
Abstract
A key challenge in contrastive learning is to generate negative samples from a large sample set to contrast with positive samples, in order to learn better encodings of the data. These negative samples often follow a softmax distribution that is dynamically updated during the training process. However, sampling from this distribution is non-trivial due to the high computational cost of computing the partition function. In this paper, we propose an Efficient Markov Chain Monte Carlo negative sampling method for Contrastive learning (EMC^2). We follow the global contrastive learning loss introduced in SogCLR, and propose EMC^2, which utilizes an adaptive Metropolis-Hastings subroutine to generate hardness-aware negative samples in an online fashion during the optimization. We prove that EMC^2 finds an O(1/T)-stationary point of the global contrastive loss in T iterations. Compared to prior works, EMC^2 is the first algorithm that achieves global convergence (to stationarity) regardless of the choice of batch size, while exhibiting low computation and memory cost. Numerical experiments validate that EMC^2 is effective with small-batch training and achieves comparable or better performance than baseline algorithms. We report results for pre-training image encoders on STL-10 and ImageNet-100.
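To make the core idea concrete, below is a minimal sketch of Metropolis-Hastings negative sampling from a softmax distribution over embedding similarities, as the abstract describes: since the acceptance test only uses a ratio of unnormalized densities, the partition function never needs to be computed. All names (`mh_negative_sample`, `tau`, `n_steps`, `state`) are illustrative assumptions for this sketch, not the paper's actual implementation or API.

```python
import torch

def mh_negative_sample(anchor, feats, tau=0.1, n_steps=10, state=None):
    """Run one Metropolis-Hastings chain targeting p(j) ∝ exp(<anchor, x_j> / tau).

    Illustrative sketch only (not the paper's exact algorithm).
    anchor: (d,) embedding of the current sample.
    feats:  (N, d) embeddings of the candidate negative pool.
    state:  current chain index, carried across training iterations ("online").
    Returns the index of a sampled, hardness-aware negative.
    """
    N = feats.shape[0]
    if state is None:
        state = torch.randint(N, (1,)).item()

    # Unnormalized log-density; the partition function cancels in the ratio below.
    def log_score(j):
        return (anchor @ feats[j]) / tau

    cur_logp = log_score(state)
    for _ in range(n_steps):
        prop = torch.randint(N, (1,)).item()  # symmetric uniform proposal
        prop_logp = log_score(prop)
        # Accept with probability min(1, exp(prop_logp - cur_logp)).
        if torch.rand(1).item() < torch.exp(prop_logp - cur_logp).item():
            state, cur_logp = prop, prop_logp
    return state
```

Carrying `state` across optimizer steps amortizes the chain's burn-in, which is one plausible reading of the "online" sampling the abstract refers to; the paper's adaptive variant and convergence analysis go beyond this sketch.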