What does perplexity mean in NLP?

What is perplexity in machine learning?

  • Perplexity measures how well a probability model predicts the test data. The model that assigns a higher probability to the test data is the better model: a good model will assign a high probability to a real sentence.

What is perplexity and why does it matter?

  • Perplexity is a metric used to judge how good a language model is. We can define perplexity as the inverse probability of the test set, normalised by the number of words N:

    PP(W) = P(w1 w2 … wN)^(−1/N)
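As a minimal sketch of this definition, the function below (a hypothetical helper, not from the original post) computes perplexity from the per-word probabilities a model assigns to a test set, working in log space to avoid numeric underflow:

```python
import math

def perplexity(word_probs):
    """PP(W) = P(w1..wN)^(-1/N): the inverse probability of the
    test set, normalised by the number of words N.
    Computed in log space to avoid underflow on long texts."""
    n = len(word_probs)
    log_prob = sum(math.log(p) for p in word_probs)
    return math.exp(-log_prob / n)

# A model that assigns probability 0.25 to every word is, on
# average, choosing uniformly among 4 options per word:
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0
```

Note the interpretation: a perplexity of 4 means the model is as uncertain as if it were picking uniformly among 4 equally likely words at each step, so lower perplexity is better.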

What is the perplexity of a test set?

  • For a test set W = w1, w2, …, wN, the perplexity is the inverse probability of the test set, normalized by the number of words:

    PP(W) = P(w1 w2 … wN)^(−1/N)

    This equation can be modified to accommodate the language model that we use. For example, if we use a bigram language model, the equation becomes:

    PP(W) = ( Π_{i=1..N} 1 / P(wi | wi−1) )^(1/N)
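To make the bigram case concrete, here is a small sketch that trains an add-k smoothed bigram model on a toy corpus and scores a test sentence. The smoothing (add-k) and the `<s>`/`</s>` boundary markers are assumptions for illustration and are not part of the original post:

```python
import math
from collections import Counter

def bigram_perplexity(train_tokens, test_tokens, vocab_size, k=1.0):
    """PP(W) = ( prod_i 1/P(wi | wi-1) )^(1/N) under an add-k
    smoothed bigram model, computed in log space.
    k and vocab_size are illustrative assumptions."""
    bigrams = Counter(zip(train_tokens, train_tokens[1:]))
    unigrams = Counter(train_tokens)

    def p(w, prev):
        # Add-k smoothed conditional probability P(w | prev).
        return (bigrams[(prev, w)] + k) / (unigrams[prev] + k * vocab_size)

    n = len(test_tokens) - 1  # number of predicted words
    log_prob = sum(math.log(p(w, prev))
                   for prev, w in zip(test_tokens, test_tokens[1:]))
    return math.exp(-log_prob / n)

train = "<s> i like nlp </s> <s> i like ml </s>".split()
test = "<s> i like nlp </s>".split()
print(bigram_perplexity(train, test, vocab_size=len(set(train))))
```

A test sentence made of frequent training bigrams scores a low perplexity, while unseen word pairs (surviving only via smoothing) push it up, which is exactly the "higher probability for real sentences" behaviour described above.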
