A statistically consistent measure of Semantic Variability using Language Models
2025-02-01
Yi Liu
Abstract
To address the issue of variability in the output generated by a language model, we present a measure of semantic variability that is statistically consistent under mild assumptions. This measure, which we call semantic spectral entropy, is an easy-to-implement algorithm that requires only off-the-shelf language models. We place very few restrictions on the language models, and we show in clear simulation studies that the method produces an accurate metric despite the randomness arising from the language models.
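The abstract does not spell out how the entropy is computed. As a minimal illustrative sketch only, one could assume the "spectral" part refers to the eigenvalue spectrum of a pairwise semantic-similarity matrix over sampled model responses (similarities from any off-the-shelf embedding or NLI model); the function name and normalization below are hypothetical, not taken from the paper:

```python
import numpy as np

def semantic_spectral_entropy(sim: np.ndarray) -> float:
    """Hypothetical sketch: Shannon entropy of the eigenvalue spectrum
    of a trace-normalized pairwise semantic-similarity matrix.

    sim: symmetric (n x n) similarity matrix over n sampled responses,
    e.g. produced by an off-the-shelf embedding or NLI model.
    """
    # Normalize so the eigenvalues are nonnegative weights summing to 1.
    rho = sim / np.trace(sim)
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]  # drop numerically zero eigenvalues
    # Entropy of the spectrum: 0 when all responses are semantically
    # identical (rank-1 matrix), log(n) when all are mutually dissimilar.
    return float(-np.sum(eig * np.log(eig)))
```

Under this sketch, a batch of semantically identical responses (an all-ones similarity matrix) yields entropy 0, while fully dissimilar responses (an identity matrix) yield the maximum value log(n).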