Mutual Information
By the chain rule for entropy, we have H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y).
Therefore, H(X) - H(X|Y) = H(Y) - H(Y|X).
This difference is called the mutual information between X and Y, denoted I(X;Y).
It is the reduction in uncertainty about one random variable due to knowledge of the other; in other words, it is the amount of information one random variable contains about the other.
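To see the symmetry numerically, here is a minimal Python sketch. The 2x2 joint distribution p(x,y) is a hypothetical example chosen for illustration, not taken from the text; the script computes H(X) - H(X|Y) and H(Y) - H(Y|X) via the chain rule and confirms they coincide.

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

h_xy = entropy(p_xy.ravel())
h_x, h_y = entropy(p_x), entropy(p_y)

# Conditional entropies from the chain rule: H(X|Y) = H(X,Y) - H(Y).
h_x_given_y = h_xy - h_y
h_y_given_x = h_xy - h_x

# Both expressions for the mutual information I(X;Y) agree.
i_1 = h_x - h_x_given_y   # H(X) - H(X|Y)
i_2 = h_y - h_y_given_x   # H(Y) - H(Y|X)
print(i_1, i_2)           # both approximately 0.278 bits
```

For this distribution both forms evaluate to about 0.278 bits: observing Y removes roughly 0.278 bits of the 1 bit of uncertainty in X, and vice versa.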