SOTAVerified

AdaCap: An Adaptive Contrastive Approach for Small-Data Neural Networks

2025-11-25 · Code Available

Bruno Belucci, Karim Lounici, Katia Meziani



Abstract

Neural networks struggle on small tabular datasets, where tree-based models remain dominant. We introduce Adaptive Contrastive Approach (AdaCap), a training scheme that combines a permutation-based contrastive loss with a Tikhonov-based closed-form output mapping. Across 85 real-world regression datasets and multiple architectures, AdaCap yields consistent and statistically significant improvements in the small-sample regime, particularly for residual models. A meta-predictor trained on dataset characteristics (size, skewness, noise) accurately anticipates when AdaCap is beneficial. These results show that AdaCap acts as a targeted regularization mechanism, strengthening neural networks precisely where they are most fragile. All results and code are publicly available at https://github.com/BrunoBelucci/adacap.
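The abstract mentions a Tikhonov-based closed-form output mapping. As a rough illustration of what such a mapping can look like, the sketch below solves a ridge-regularized least-squares problem in closed form on top of fixed hidden representations. This is a minimal sketch under assumptions: the variable names (`H`, `lam`, `hidden_dim`) and this exact formulation are illustrative, not the authors' implementation, and the contrastive component is omitted.

```python
import numpy as np

# Illustrative Tikhonov (ridge) closed-form output mapping.
# Assumption: H stands in for last-layer network representations;
# in AdaCap the mapping is combined with a contrastive loss (not shown).
rng = np.random.default_rng(0)
n, hidden_dim = 64, 16                  # small-sample regime
H = rng.normal(size=(n, hidden_dim))    # hidden representations (placeholder)
y = rng.normal(size=(n, 1))             # regression targets (placeholder)
lam = 1.0                               # Tikhonov regularization strength

# Closed-form solution: W = (H^T H + lam * I)^{-1} H^T y
W = np.linalg.solve(H.T @ H + lam * np.eye(hidden_dim), H.T @ y)
y_hat = H @ W                           # predictions from the linear head
print(W.shape, y_hat.shape)
```

Because the mapping has a closed form, no gradient steps are spent fitting the output layer, which is one plausible reason such schemes act as regularizers in the small-sample regime.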
