

E.T. Jaynes united Bayesian Probability Theory with Information Theory.

From a brief biography:

Ed's vastly influential synthesis of the ideas and results of Laplace, Bayes, Jeffreys, Cox, and Shannon into a consistent modern framework of probabilistic reasoning is a natural outgrowth of this striking early work. A series of beautifully composed and argued papers on this subject and on information-theoretic statistical mechanics was published in conference proceedings volumes -- it was commonplace for mainstream journals to reject Ed's manuscripts. Among these is the classic "How the Brain Does Plausible Reasoning," originally a 1959 Stanford Microwave Laboratory Report. Jaynes' impact on the field of statistical inference has been enormous and has been summarized in Physics & Probability: Essays in Honor of Edwin T. Jaynes, edited by W. T. Grandy, Jr. and P. W. Milonni and published by Cambridge University Press in 1993. Ed left us a virtually complete book manuscript entitled Probability Theory: The Logic of Science, a monumental contribution which, online and in preprint form, has already become one of the most widely studied books in science. His writings expose the foundations of "the calculus of inductive reasoning" with a clarity and elegance that will continue to enlighten and delight his readers for many generations to come.

Jaynes is probably best known for his contribution to the Principle of Maximum Entropy. Writing first in the Physical Review in 1957, he had decades in which to simplify and clarify the arguments before summarizing them in 1982. Apply the Stirling approximation to the factorials that arise when counting multiplicities, and voila: Shannon's familiar -sum (p log p) appears. One sees readily why an overwhelming fraction of outcomes are the high-entropy outcomes, and therefore why virtually every common probability distribution (uniform, exponential, geometric, normal, etc.) is the unique maximum-entropy member of some class of distributions subject to some constraint. And the method of Lagrange multipliers generally offers a very straightforward way to calculate such solutions.
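The Lagrange-multiplier recipe above can be sketched numerically. Here is a minimal illustration in the spirit of Jaynes's famous "Brandeis dice" example: find the maximum-entropy distribution over the faces of a die whose long-run mean is constrained to 4.5. The multiplier method says the solution has the exponential form p_i proportional to exp(-lambda * i), so all that remains is a one-dimensional search for the lambda that satisfies the constraint. (The function name and the bisection search are my own sketch, not Jaynes's code.)

```python
import math

def maxent_dice(target_mean, faces=tuple(range(1, 7)), tol=1e-12):
    """Maximum-entropy distribution over die faces with a fixed mean.

    Lagrange multipliers give the solution p_i ~ exp(-lam * i);
    we bisect on lam until the distribution's mean hits target_mean.
    """
    def mean_for(lam):
        w = [math.exp(-lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0          # mean_for is decreasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid              # mean too high: push lam up
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_dice(4.5)
```

Because the constrained mean (4.5) exceeds the fair-die mean (3.5), the multiplier comes out negative and the resulting probabilities rise monotonically from face 1 to face 6 -- the least biased assignment consistent with the single constraint.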

As a physicist (whose doctoral advisor was Wigner), Jaynes was particularly obsessed with debunking widely held conceptual blunders. From the same biography linked above:

Ed insisted that some of the thorniest conceptual problems faced in physics, notably in statistical physics and quantum theory, arise from a mistaken identification of probabilities as physical quantities rather than as representations of the available information on a system -- a confusion between what is ontological and what is epistemological. Like Einstein, he was repelled by the Copenhagen interpretation of quantum mechanics and what he viewed as an incursion of mysticism into science.

For a remarkably lucid look at three such thorny problems in physics, see the first 16 pages of his 1988 paper Clearing up Mysteries: The Original Goal.


Published works.

Probability Theory: The Logic of Science as a single pdf file.

The entry on E.T. Jaynes at Wikipedia.