
Maximally Separated Active Learning

2024-11-26

Tejaswi Kasarla, Abhishek Jha, Faye Tervoort, Rita Cucchiara, Pascal Mettes


Abstract

Active learning aims to optimize performance while minimizing annotation costs by selecting the most informative samples from an unlabelled pool. Traditional uncertainty sampling often introduces sampling bias by repeatedly choosing similar uncertain samples. We propose an active learning method that uses fixed equiangular hyperspherical points as class prototypes, ensuring consistent inter-class separation and robust feature representations. Our approach introduces Maximally Separated Active Learning (MSAL) for uncertainty sampling and a combined strategy (MSAL-D) that additionally incorporates diversity. This eliminates the need for costly clustering steps while maintaining diversity through hyperspherical uniformity. We demonstrate strong performance over existing active learning techniques across five benchmark datasets, highlighting the method's effectiveness and ease of integration. The code is available on GitHub.
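To make the core idea concrete, here is a minimal sketch of the two ingredients the abstract names: fixed maximally separated class prototypes on the hypersphere, and an uncertainty score computed against them. The simplex construction and the margin-style scoring rule below are illustrative assumptions, not the paper's exact implementation; `simplex_prototypes` and `margin_uncertainty` are hypothetical helper names.

```python
import numpy as np

def simplex_prototypes(num_classes: int) -> np.ndarray:
    # Simplex equiangular frame: C unit vectors in R^C whose pairwise
    # cosine similarity is exactly -1/(C-1), i.e. maximally separated.
    C = num_classes
    M = np.eye(C) - np.ones((C, C)) / C
    M = M / np.linalg.norm(M, axis=1, keepdims=True)
    return M  # rows are the fixed class prototypes

def margin_uncertainty(features: np.ndarray, prototypes: np.ndarray) -> np.ndarray:
    # Margin-style uncertainty (illustrative): a small gap between the two
    # closest prototypes means the sample is ambiguous.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = feats @ prototypes.T                # cosine similarity to each prototype
    top2 = np.sort(sims, axis=1)[:, -2:]       # two highest similarities per sample
    return -(top2[:, 1] - top2[:, 0])          # higher score = more uncertain

prototypes = simplex_prototypes(5)
rng = np.random.default_rng(0)
scores = margin_uncertainty(rng.normal(size=(8, 5)), prototypes)
query = np.argsort(scores)[-3:]  # indices of the 3 most uncertain samples
```

Because the prototypes are fixed and already uniformly spread on the hypersphere, selection needs no clustering pass over the unlabelled pool, which is the efficiency argument the abstract makes.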
