Entropy is the average amount of information produced by a source. It is also the minimum number of bits required, on average, to encode the source without losing any information.
Mathematically, the information content \( h(x) \) of an event \( x \) with probability \( p(x) \) is \( h(x) = -\log_2 p(x) \). Entropy is the expected value of this quantity over the \( n \) possible outcomes of the source:
\[ H = -\sum_{i=1}^{n} p_i \log_2(p_i) \]
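The formula above can be sketched directly in code; the helper name `entropy` is just an illustrative choice:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    Assumes `probs` is a sequence of probabilities summing to 1.
    Zero-probability outcomes contribute nothing, by the usual
    convention that 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))  # biased coin: about 0.469 bits, more predictable
print(entropy([1.0]))       # certain outcome: 0.0 bits, no information
```

A uniform distribution maximizes entropy, matching the intuition that a source is hardest to compress when all outcomes are equally likely.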