INFORMATION THEORY

(or statistical communication theory) A calculus of variation, variability and variance, initially developed by Shannon to separate noise from information-carrying signals. It is now used to trace the flow of information in complex systems, to decompose a system into independent (see independence) or semi-independent sub-systems, to evaluate the efficiency of communication channels and of various communication codes (see redundancy, noise, equivocation), and to compare information needs with the capacities of existing information processors (see computer, Bremermann's limit).

The basic quantity this calculus analyses (see analysis) is the total amount of statistical entropy that given data contain about an observed system. The calculus provides an algebra for decomposing and thus accounting for this entropy in numerous ways. For example, the quantity of entropy in an observed system equals the sum of the entropies in all of its separate parts minus the amount of information transmitted within the system. The latter quantity is the amount of entropy in a system not explainable from its parts, and it is an expression of the communication between these parts. This formula is another example of the cybernetic analysis of systems, according to which any whole system is accounted for or defined in terms of a set of components and its organization. The total amount of information transmitted is a quantitative analogue of, and hence can be thought of as a measure of, a system's structure. Information theory has provided numerous theorems and algebraic identities with which observed systems may be approached, e.g., the law of requisite variety and the tenth theorem of information theory. (Krippendorff)
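
The decomposition described above can be stated as a worked equation (a standard information-theoretic rendering of that sentence; the symbols H for entropy, X_i for the parts, and T for the transmitted information are notation assumed here, not from the original):

    H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i) - T(X_1, \ldots, X_n)

Here H(X_1, ..., X_n) is the entropy of the whole system, each H(X_i) is the entropy of one part taken separately, and T, often called the total correlation, is the amount of information transmitted within the system.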
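
A minimal numerical sketch of this identity, assuming a toy two-part system with a known joint distribution (the helper function and the example probabilities are illustrative, not from the original):

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a probability array (zero entries ignored)."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Toy joint distribution over two binary parts X and Y (rows: X, columns: Y).
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])

    H_whole = entropy(joint.flatten())   # entropy of the observed system as a whole
    H_x = entropy(joint.sum(axis=1))     # entropy of part X taken separately
    H_y = entropy(joint.sum(axis=0))     # entropy of part Y taken separately

    # Information transmitted within the system: the entropy the separate
    # parts carry in excess of the whole, i.e. T = sum of parts - whole.
    T = H_x + H_y - H_whole

    print(f"H(X,Y)        = {H_whole:.3f} bits")
    print(f"H(X) + H(Y)   = {H_x + H_y:.3f} bits")
    print(f"T transmitted = {T:.3f} bits")

For this joint distribution the parts each carry 1 bit, the whole carries about 1.722 bits, and the difference of about 0.278 bits is the information transmitted between the parts; a system of statistically independent parts would give T = 0.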

URL= http://cleamc11.vub.ac.be/ASC/INFORM_THEOR.html