The relationship between Perplexity and Entropy in NLP
Use Information Theory to understand NLP Metrics
Jun 7, 2020
Perplexity is a common metric for evaluating language models. For example, scikit-learn’s implementation of Latent Dirichlet Allocation (a topic-modeling algorithm) exposes perplexity as a built-in metric.
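As a quick illustration of that built-in metric, here is a minimal sketch of fitting scikit-learn's `LatentDirichletAllocation` on a toy corpus (the documents below are made-up examples) and reading off its perplexity:

```python
# Minimal sketch: perplexity as a built-in metric in scikit-learn's LDA.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Toy corpus, chosen only for illustration.
docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "stock prices fell on monday",
    "the market rallied after the news",
]

# LDA operates on bag-of-words counts.
X = CountVectorizer().fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Lower perplexity means the model assigns higher probability to the data.
print(lda.perplexity(X))
```

Calling `perplexity` on held-out documents, rather than the training set, gives a fairer picture of how well the topics generalize.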