Relative Entropy or Kullback-Leibler Divergence
For two pmfs p(x) and q(x), their relative entropy is:
D(p \| q) = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}
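A minimal sketch of this formula in Python, assuming two hypothetical pmfs p and q over a small four-symbol alphabet (the dictionaries and their values are illustrative only, not taken from the text):

```python
import math

def kl_divergence(p, q):
    """D(p||q) = sum_x p(x) * log2(p(x)/q(x)), in bits.

    p and q are dicts mapping outcomes to probabilities over the same
    event space; terms with p(x) == 0 contribute 0 by convention.
    """
    total = 0.0
    for x, px in p.items():
        if px > 0:
            total += px * math.log2(px / q[x])
    return total

# Hypothetical pmfs over a four-symbol alphabet (illustrative values only).
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
q = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}

print(kl_divergence(p, q))  # ~0.25 bits; note D(p||q) != D(q||p) in general
```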
The relative entropy (also known as the Kullback-Leibler divergence) is a measure of how different two probability distributions (over the same event space) are.
The KL divergence between p and q can also be seen as the average number of bits that are wasted when events drawn from a distribution p are encoded with a code optimized for a not-quite-right distribution q.
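A short sketch of this "wasted bits" reading, under the same assumed pmfs as above: the cross-entropy H(p, q) is the average code length when events from p are encoded with a code optimized for q, the entropy H(p) is the optimal average length, and their difference recovers D(p||q):

```python
import math

def entropy(p):
    """H(p) = -sum_x p(x) log2 p(x): optimal average code length for p, in bits."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log2 q(x): average code length when events
    drawn from p are encoded with a code optimized for q."""
    return -sum(px * math.log2(q[x]) for x, px in p.items() if px > 0)

# Same illustrative pmfs as above.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
q = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}

wasted = cross_entropy(p, q) - entropy(p)   # extra bits per event
print(wasted)                               # ~0.25, matching D(p||q) above
```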