Entropy
Entropy is the average uncertainty of a single random variable.
Let p(x) = P(X = x), where x ∈ X.
H(p) = H(X) = −∑_{x∈X} p(x) log₂ p(x)
In other words, entropy measures the amount of information in a random variable. It is normally measured in bits.
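As a sketch, the definition above can be computed directly; the function name `entropy` and the example distributions are illustrative, not from the source:

```python
import math

def entropy(probs):
    """Shannon entropy H(p) = -sum over x of p(x) * log2(p(x)), in bits.

    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))
```

A uniform distribution over n outcomes gives the maximum entropy log₂ n; any skew toward some outcomes reduces the average uncertainty.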