
Entropy analysis of word-length series of natural language texts: Effects of text language and genre

2014-01-17

Maria Kalimeri, Vassilios Constantoudis, Constantinos Papadimitriou, Kostantinos Karamanos, Fotis K. Diakonos, Haris Papageorgiou


Abstract

We estimate the n-gram entropies of natural language texts in word-length representation and find that these are sensitive to text language and genre. We attribute this sensitivity to changes in the probability distribution of the lengths of single words and emphasize the crucial role of the uniformity of the probabilities of word lengths between five and ten. Furthermore, comparison with the entropies of shuffled data reveals the impact of word-length correlations on the estimated n-gram entropies.
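The pipeline the abstract describes can be illustrated with a minimal sketch: map a text to its word-length series, estimate the Shannon entropy of its n-gram (block) distribution, and compare against a shuffled series that preserves the single-word length distribution but destroys word-length correlations. This is an assumption-laden illustration, not the authors' exact estimator; the tokenization (whitespace split) and the function names `word_lengths` and `ngram_entropy` are placeholders.

```python
import math
import random
from collections import Counter

def word_lengths(text):
    """Map a text to its word-length series (one integer per word).
    Whitespace tokenization is an illustrative assumption."""
    return [len(w) for w in text.split()]

def ngram_entropy(series, n):
    """Shannon entropy (in bits) of the empirical n-gram distribution."""
    grams = [tuple(series[i:i + n]) for i in range(len(series) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

text = ("entropy analysis of word length series reveals "
        "effects of text language and genre")
lengths = word_lengths(text)

# n-gram entropy of the original word-length series
h2 = ngram_entropy(lengths, 2)

# Shuffled baseline: same single-word length distribution,
# but word-length correlations are destroyed.
shuffled = lengths[:]
random.shuffle(shuffled)
h2_shuffled = ngram_entropy(shuffled, 2)
```

Comparing `h2` with `h2_shuffled` over many shuffles isolates the contribution of word-length correlations to the estimated n-gram entropy, in the spirit of the abstract's shuffled-data comparison.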
