Loss Functions for Classification using Structured Entropy
Brian Lucena
Code: github.com/numeristical/resources
Abstract
Cross-entropy loss is the standard metric used to train classification models in deep learning and gradient boosting. It is well known that this loss function fails to account for similarities between the different values of the target. We propose a generalization of entropy called structured entropy, which uses a random partition to incorporate the structure of the target variable in a manner that retains many theoretical properties of standard entropy. We show that a structured cross-entropy loss yields better results on several classification problems in which the target variable has a known a priori structure. The approach is simple, flexible, easily computable, and does not rely on a hierarchically defined notion of structure.
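To make the idea concrete, the following is a minimal sketch of a partition-based cross-entropy: class probabilities are aggregated over the blocks of each partition, and the per-partition cross-entropies are averaged with weights. The function and variable names here are illustrative assumptions, not the paper's API, and the weighting over partitions is a simplification of the random-partition formulation.

```python
import numpy as np

def partition_cross_entropy(probs, label, partitions, weights, eps=1e-12):
    """Illustrative structured cross-entropy (sketch, not the paper's code).

    For each partition, class probabilities are summed within blocks,
    and the cross-entropy of the block containing the true label is
    computed; the per-partition losses are combined with the given weights.
    """
    loss = 0.0
    for part, w in zip(partitions, weights):
        # Aggregate class probabilities within each block of this partition.
        block_probs = [sum(probs[c] for c in block) for block in part]
        # Identify the block containing the true label.
        idx = next(i for i, block in enumerate(part) if label in block)
        loss += -w * np.log(block_probs[idx] + eps)
    return loss

# Example: 4 classes, two partitions -- the trivial partition (every
# class its own block, giving the standard cross-entropy term) and a
# coarse partition grouping a-priori-similar classes together.
probs = np.array([0.5, 0.2, 0.2, 0.1])
partitions = [
    [[0], [1], [2], [3]],   # trivial partition: standard cross-entropy
    [[0, 1], [2, 3]],       # coarse structure: {0,1} vs {2,3}
]
weights = [0.5, 0.5]
print(partition_cross_entropy(probs, label=0, partitions=partitions, weights=weights))
```

With equal weights, a mistake that confuses classes within the same coarse block (here, 0 and 1) is penalized less than one that crosses blocks, which is the intuition behind incorporating target structure into the loss.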