Organodynamics | Grant Holland, Apr 25, 2014 |
Slide: What is Information Theory? |
The perspective on information theory taken by organodynamics: A branch of probability theory that characterizes the uncertainty of chance variation through the development and application of entropic functionals: entropy, joint entropy, conditional entropy, relative entropy, mutual information, entropy rate… "The central idea of information theory is to measure the uncertainty associated with random variables" [in the loose sense]. [Kleeman 2009, Lecture 1, p. 1] |
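For concreteness, the standard forms of these functionals, for discrete random variables X and Y with mass functions p and q (the entropy rate is stated for a stationary process):

  H(X) = -\sum_x p(x) \log p(x)                                  (entropy)
  H(X,Y) = -\sum_{x,y} p(x,y) \log p(x,y)                        (joint entropy)
  H(Y|X) = H(X,Y) - H(X)                                         (conditional entropy)
  D(p \| q) = \sum_x p(x) \log [ p(x) / q(x) ]                   (relative entropy)
  I(X;Y) = H(X) + H(Y) - H(X,Y)                                  (mutual information)
  H(\mathcal{X}) = \lim_{n \to \infty} (1/n) H(X_1, \ldots, X_n) (entropy rate)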
Information theory is often confused with communications theory, and the two terms are often used interchangeably [Pierce 1980, Preface, p. vii]. This is an unfortunate misunderstanding:
1. We don't need two names for the same theory.
2. We also need a name for the study of entropic functionals.
Proposal: adopt the name information theory to mean the study of entropy and statistical entropic functionals. Some mathematicians already do, e.g. [Kleeman 2012, Lecture 1, p. 1] |
Shannon [Shannon 1948] invented communications theory, specifying that "the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point," and that a communications system consists of five parts: information source, transmitter, channel, receiver and destination. Clearly, communications theory is part of electrical engineering or computer science, while information theory is a branch of mathematics (probability theory). |
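A toy sketch of those five parts in code; the repetition code, the bit-flip channel, and the majority-vote decoder are illustrative assumptions, not details from Shannon's description:

import random

random.seed(0)

def transmitter(message):
    # Encode each source bit with a 3x repetition code.
    return [b for bit in message for b in (bit, bit, bit)]

def channel(signal, flip_prob=0.1):
    # Binary symmetric channel: each bit flips independently with flip_prob.
    return [bit ^ (random.random() < flip_prob) for bit in signal]

def receiver(signal):
    # Majority vote over each group of three received bits.
    return [int(sum(signal[i:i + 3]) >= 2) for i in range(0, len(signal), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]                   # information source
delivered = receiver(channel(transmitter(message)))  # destination's copy
print("sent:     ", message)
print("delivered:", delivered)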
According to this view: confusing information theory with communications theory is like confusing calculus with Newtonian mechanics. Mechanics is an application of calculus; communications theory is an application of information theory. "Information theory is applicable to any situation that has probabilities." [Jaynes 1972] <grant – verify ref> |
Notes:
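An illustration of the Jaynes point above: entropy and mutual information computed directly from an arbitrary discrete joint distribution, with no communications system involved. A minimal sketch in Python (the joint distribution is an invented example):

import numpy as np

# Invented joint distribution p(x, y) over a 2x3 alphabet; any nonnegative
# matrix summing to 1 would do -- information theory applies wherever
# there are probabilities.
p_xy = np.array([[0.10, 0.20, 0.05],
                 [0.25, 0.30, 0.10]])

def entropy(p):
    """Shannon entropy in bits, with 0 log 0 taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy.ravel())        # joint entropy H(X,Y)
H_x = entropy(p_xy.sum(axis=1))     # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))     # marginal entropy H(Y)

print("H(X,Y) =", round(H_xy, 3), "bits")
print("H(Y|X) =", round(H_xy - H_x, 3), "bits")        # chain rule
print("I(X;Y) =", round(H_x + H_y - H_xy, 3), "bits")  # always >= 0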