
Fast and close Shannon entropy approximation

2025-05-20

Illia Horenko, Davide Bassetti, Lukáš Pospíšil


Abstract

Shannon entropy (SE) and its quantum-mechanical analogue, von Neumann entropy, are key components of many tools used in physics, information theory, machine learning (ML), and quantum computing. Beyond the significant amount of SE computation these fields require, the singularity of the SE gradient is one of the central mathematical reasons for the high cost, frequently low robustness, and slow convergence of such tools. Here we propose the Fast Entropy Approximation (FEA), a non-singular rational approximation of Shannon entropy and its gradient that achieves a mean absolute error of 10^-3, approximately 20 times lower than comparable state-of-the-art methods. FEA allows around 50% faster computation, requiring only 5 to 6 elementary operations, compared to the tens of elementary operations behind the fastest entropy computation algorithms based on table look-ups, bit shifts, or series approximations. On a set of common benchmarks for the feature selection problem in machine learning, we show that the combined effect of fewer elementary operations, low approximation error, and a non-singular gradient yields significantly better model quality, and enables ML feature extraction that is two to three orders of magnitude faster and computationally cheaper when FEA is incorporated into AI tools.
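The abstract does not give the FEA formula itself, but the idea — replacing the singular entropy term h(p) = -p·ln(p) with a low-degree rational function whose value and gradient stay finite on [0, 1] — can be sketched as follows. The surrogate form p(1-p)(c0 + c1·p + c2·p²)/(1 + p) and its fitted coefficients below are purely illustrative assumptions, not the paper's published approximation; the factor p(1-p) pins the endpoints and the fixed positive denominator rules out singularities.

```python
import numpy as np

def entropy_term(p):
    """Exact Shannon entropy contribution h(p) = -p*ln(p), with h(0) = 0."""
    p = np.asarray(p, dtype=float)
    out = np.zeros_like(p)
    mask = p > 0
    out[mask] = -p[mask] * np.log(p[mask])
    return out

# Illustrative rational surrogate (NOT the paper's published FEA formula):
#   r(p) = p*(1-p)*(c0 + c1*p + c2*p^2) / (1 + p)
# p*(1-p) enforces r(0) = r(1) = 0, matching the exact entropy term,
# and the fixed positive denominator keeps r and r' finite on [0, 1].
p = np.linspace(0.0, 1.0, 2001)
h = entropy_term(p)
basis = np.column_stack([p, p**2, p**3]) * ((1 - p) / (1 + p))[:, None]
coef, *_ = np.linalg.lstsq(basis, h, rcond=None)  # least-squares fit of c0..c2

def fea_like(p):
    """Cheap, non-singular rational approximation of -p*ln(p)."""
    num = p * (1 - p) * (coef[0] + p * (coef[1] + p * coef[2]))
    return num / (1 + p)

err = np.abs(fea_like(p) - h)
print(f"mean abs error: {err.mean():.1e}, max abs error: {err.max():.1e}")
# The exact gradient -(ln p + 1) diverges as p -> 0;
# the surrogate's gradient at p = 0 is simply c0, which is finite:
print(f"surrogate gradient at p=0: {coef[0]:.3f}")
```

Evaluating the surrogate costs only a handful of multiplications, additions, and one division — the kind of elementary-operation budget the abstract cites — and its bounded gradient is what makes it attractive inside gradient-based ML optimizers.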
