PRINCIPIA CYBERNETICA WEB - ©



STATISTICAL ENTROPY

A measure of variation or diversity defined on the probability distribution of observed events. Specifically, if P_a is the probability of event a, the entropy H(A) over all events a in A is:

H(A) = -SUM_a P_a log_2 P_a

The quantity is zero when all events are of the same kind, P_a = 1 for a single a in A, and is positive otherwise. Its upper limit is log_2 N, where N is the number of categories available (see degrees of freedom); this limit is reached when the distribution is uniform over them, P_a = 1/N for all a in A (see variety, uncertainty, negentropy). The statistical entropy measure is the most basic measure of information theory. (Krippendorff)
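The definition above can be sketched in a few lines of Python. This is an illustrative implementation, not part of the original entry; the function name and the convention that zero-probability terms contribute nothing (since p log p tends to 0 as p tends to 0) are assumptions of the sketch.

```python
import math

def statistical_entropy(probs):
    """Statistical (Shannon) entropy H(A) = -SUM_a P_a log_2 P_a, in bits.

    probs: probabilities of the events in A, summing to 1.
    Terms with P_a = 0 are skipped, following the convention that
    p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# All events of the same kind (P_a = 1 for one a): entropy is zero.
degenerate = statistical_entropy([1.0, 0.0, 0.0, 0.0])

# Uniform distribution over N = 4 categories: entropy reaches log_2 4 = 2.
uniform = statistical_entropy([0.25, 0.25, 0.25, 0.25])
```

Here `degenerate` evaluates to 0 and `uniform` to 2 bits, matching the lower and upper limits stated above.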



URL= http://cleamc11.vub.ac.be/ASC/STATIS_ENTRO.html