Neural collapse in the orthoplex regime
James Alcala, Rayna Andreeva, Vladimir A. Kobzar, Dustin G. Mixon, Sanghoon Na, Shashank Sule, Yangxinyu Xie
Abstract
When training a neural network for classification, the feature vectors of the training set are known to collapse to the vertices of a regular simplex, provided the dimension d of the feature space and the number n of classes satisfy n ≤ d+1. This phenomenon is known as neural collapse. For other applications like language models, one instead takes n > d. Here, the neural collapse phenomenon still occurs, but with different emergent geometric figures. We characterize these geometric figures in the orthoplex regime, where d+2 ≤ n ≤ 2d. The techniques in our analysis primarily involve Radon's theorem and convexity.
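As a rough illustration of the two reference configurations mentioned in the abstract (not taken from the paper itself), the NumPy sketch below builds unit vectors at the vertices of a regular simplex (n = d+1 points in R^d) and of an orthoplex, i.e. cross-polytope (n = 2d points in R^d), and prints their Gram matrices. The simplex construction by centering the standard basis is one standard choice assumed here; the paper's own characterization of the collapsed geometry is not reproduced.

```python
import numpy as np

def simplex_vertices(d):
    """Unit vectors at the vertices of a regular simplex: n = d + 1 points in R^d."""
    n = d + 1
    E = np.eye(n)                        # standard basis of R^n
    C = E - E.mean(axis=0)               # center at the origin (rank d)
    U, _, _ = np.linalg.svd(C.T, full_matrices=False)
    V = C @ U[:, :d]                     # coordinates in the d-dimensional span
    return V / np.linalg.norm(V, axis=1, keepdims=True)

def orthoplex_vertices(d):
    """Vertices ±e_i of the orthoplex (cross-polytope): n = 2d points in R^d."""
    return np.vstack([np.eye(d), -np.eye(d)])

d = 4
S = simplex_vertices(d)     # simplex regime: n = d + 1 classes
O = orthoplex_vertices(d)   # top of the orthoplex regime: n = 2d classes
# Distinct simplex vertices all meet at inner product -1/d;
# distinct orthoplex vertices are either orthogonal (0) or antipodal (-1).
print(np.round(S @ S.T, 3))
print(np.round(O @ O.T, 3))
```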