http://mathworld.i2p/Entropy.html
The (Shannon) entropy of a variable $X$ is defined as

$$H(X) \equiv -\sum_i p_i \log_2 p_i$$

bits, where $p_i$ is the probability that $X$ is in the state $i$, and $p_i \log_2 p_i$ is defined as 0 if $p_i = 0$. The joint entropy of variables $X_1$, ..., $X_n$ is then defined by

$$H(X_1, \ldots, X_n) \equiv -\sum_{x_1} \cdots \sum_{x_n} p(x_1, \ldots, x_n) \log_2 p(x_1, \ldots, x_n).$$

See also Differential Entropy, Information Theory, Kolmogorov Entropy, Maximum Entropy Method, Metric Entropy, Mutual Information, Nat, Ornstein's Theorem, Redundancy, Relative Entropy, ...
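The definition above can be sketched in a few lines of Python; the function name and example distributions are illustrative choices, not part of the original entry. Terms with $p_i = 0$ are skipped, matching the convention that $p_i \log_2 p_i$ is taken to be 0 there.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 are omitted, following the convention
    that 0 * log2(0) is defined as 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))        # → 1.0
# A uniform distribution over 4 states carries 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

The same function computes a joint entropy when given the joint probabilities $p(x_1, \ldots, x_n)$ flattened into a single list, since the joint definition is the same sum taken over all outcome tuples.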