Book contents
- Frontmatter
- Contents
- Editor's foreword
- Preface
- Part I Principles and elementary applications
- Part II Advanced applications
- 11 Discrete prior probabilities: the entropy principle
- 12 Ignorance priors and transformation groups
- 13 Decision theory, historical background
- 14 Simple applications of decision theory
- 15 Paradoxes of probability theory
- 16 Orthodox methods: historical background
- 17 Principles and pathology of orthodox statistics
- 18 The Ap distribution and rule of succession
- 19 Physical measurements
- 20 Model comparison
- 21 Outliers and robustness
- 22 Introduction to communication theory
- Appendix A Other approaches to probability theory
- Appendix B Mathematical formalities and style
- Appendix C Convolutions and cumulants
- References
- Bibliography
- Author index
- Subject index
22 - Introduction to communication theory
from Part II - Advanced applications
Published online by Cambridge University Press: 05 September 2012
Summary
We noted in Chapter 11 that one of the motivations behind this work was the attempt to see Gibbsian statistical mechanics and Shannon's communication theory as examples of the same line of reasoning. A generalized form of statistical mechanics appeared as soon as we introduced the notion of entropy, and we ought now to be in a position to treat communication theory in a similar way.
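The "generalized form of statistical mechanics" referred to here is the maximum-entropy principle of Chapter 11: among all distributions satisfying the given constraints, choose the one of greatest entropy. As a minimal illustration (not from the text; the function name and the bisection solver are my own choices), the sketch below finds the maximum-entropy distribution over a die's faces when only the mean roll is known, using the standard exponential-family form of the solution:

```python
import math

def maxent_mean(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over `values` with a prescribed mean.

    The maxent solution has the exponential form p_i ∝ exp(-lam * x_i);
    the Lagrange multiplier `lam` is found by bisection on the mean,
    which is a decreasing function of lam.
    """
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid  # mean too large: increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Illustrative case: a die whose long-run average roll is 4.5
# (the distribution tilts toward the higher faces).
p = maxent_mean([1, 2, 3, 4, 5, 6], 4.5)
H = -sum(pi * math.log(pi) for pi in p)  # Shannon entropy of the result
```

With a uniform constraint (mean 3.5) the same routine returns the uniform distribution, recovering the intuition that maximum entropy adds nothing beyond what the constraints assert.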
One difference is that in statistical mechanics the prior information has nothing to do with frequencies (it consists of measured values of macroscopic quantities such as pressure), and so we have little temptation to commit errors. But in communication theory the prior information consists, typically, of frequencies; this makes the probability–frequency conceptual pitfalls much more acute. For this reason it seemed best to take up communication theory only after we had seen the general connections between probability and frequency, in a variety of conceptually simpler applications.
Origins of the theory
First, there is the difficult matter of giving credit where credit is due. All major advances in understanding have their precursors, whose full significance is never recognized at the time. Relativity theory had them in the work of Mach, FitzGerald, Lorentz and Poincaré, to mention only the most obvious examples. Communication theory had many precursors, in the work of Gibbs, Nyquist, Hartley, Szilard, von Neumann, and Wiener. But there is no denying that the work of Shannon (1948) represents the arrival of the main signal, just as did Einstein's of 1905.
- Type: Chapter
- Book: Probability Theory: The Logic of Science, pp. 627–650
- Publisher: Cambridge University Press
- Print publication year: 2003