Statistical NLP: Lecture 5

3/17/98


Table of Contents

Statistical NLP: Lecture 5

Entropy

Joint Entropy and Conditional Entropy

Mutual Information

The Noisy Channel Model

Relative Entropy or Kullback-Leibler Divergence

The Relation to Language: Cross-Entropy

The Entropy of English

Perplexity

Author: N & N

Email: nat@cs.dal.ca

Home Page: http://borg.cs.dal.ca/~nat