Shannon's entropy: an axiomatic approach
DOI: https://doi.org/10.35819/remat2021v7i1id4756
Keywords: Axioms, Entropy, Communication, Shannon's Theory, Functional Equations
Abstract
The word "entropy" arose in 1864, in the works on thermodynamics by Rudolf Clausius. In 1948, Claude E. Shannon used the same name to designate a measure of information in his mathematical model of communication, based on the concepts of emitter, receiver, channel, noise, redundancy, coding and decoding. With the measure of information H(X) = -C * sum_{i=1}^{n} [p_i * log(p_i)], the Shannon entropy, it became possible to analyze the capacity of a communication channel and to invest in data processing, giving rise to what is currently called Information Theory. Beyond the operational and technological aspects of Shannon's theory, which ushered in the digital age, mathematical approaches to the formula H(X) also reveal a line of research focused on the characterization of information measures. A didactic exposition of the mathematical deduction of the Shannon entropy formula from a set of axioms is therefore of interest not only in a pedagogical sense, but also for understanding Shannon's theory. It thereby shows that this formula is immersed in a well-defined mathematical context (a system of axioms and functional equations), which allows, by changing the axioms, the definition of new measures of information.
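To make the formula concrete, the minimal sketch below (not part of the original article) computes H(X) for a finite probability distribution, assuming the constant C = 1 and base-2 logarithms so that the entropy is expressed in bits; the function name and example values are illustrative.

```python
# Minimal sketch of H(X) = -C * sum_i p_i * log(p_i), assuming C = 1 and base 2 (bits).
import math

def shannon_entropy(probabilities, base=2.0):
    """Return H(X) = -sum_i p_i * log(p_i) for a discrete distribution."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i = 0 contribute nothing, since p*log(p) -> 0 as p -> 0.
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Example: a fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # approximately 0.469
```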
References
ACZÉL, J.; DARÓCZY, Z. On measures of information and their characterizations. v. 115, New York: Academic Press, 1975.
ACZÉL, J.; FORTE, B.; NG, C. T. Why the Shannon and Hartley entropies are "natural". Advances in Applied Probability, v. 6, n. 1, p. 131-146, 1974. DOI: https://doi.org/10.2307/1426210.
ACZÉL, J. Measuring information beyond communication theory: Some probably useful and some almost certainly useless generalizations. Information Processing & Management, v. 20, n. 3, p. 383-395, 1984a.
ACZÉL, J. Measuring information beyond communication theory: Why some generalized information measures may be useful, others not. Survey paper. Aequationes Mathematicae, v. 27, p. 1-19, 1984b.
ASH, R. B. Information Theory. New York: Dover Publications, 1990.
BARTLE, R. G.; SHERBERT, D. R. Introduction to Real Analysis. 4. ed, New York: John Wiley & Sons, 2011.
CLAUSIUS, R. Abhandlungen über die mechanische Wärmetheorie. Braunschweig: Druck Und Verlag Von Friedrich Vieweg Und Sohn, 1864.
COVER, T. M.; THOMAS, J. A. Elements of Information Theory. 2. ed. NJ: John Wiley & Sons, 2006. (1. ed., 1991).
EBANKS, B.; SAHOO, P.; SANDER, W. Characterizations of Information Measures. Singapore: World Scientific, 1998.
FADDEEV, D. K. On the Concept of Entropy of a Finite Probability Scheme. Originally published in Russian in Uspekhi Matematicheskikh Nauk, v. 11, n. 1 (67), p. 227-231, 1956.
INGARDEN, R. S.; URBANIK, K. Information without probability. Colloquium Mathematicum, v. IX, p. 132-150, 1962.
KHINCHIN, A. I. Mathematical Foundations of Information Theory. Translated by R. A. Silverman and M. D. Friedman. New York: Dover Publications, 1957.
MAGOSSI, J. C.; PAVIOTTI, J. R. Incerteza em Entropia. Revista Brasileira de História da Ciência, Rio de Janeiro, v. 12, n. 1, p. 84-96, jan./jun. 2019.
PIERCE, J. R. An Introduction to Information Theory - Symbols, Signals and Noise. New York: Dover Publications, 1961.
REZA, F. M. An Introduction to Information Theory. New York: McGraw-Hill, 1961.
RIOUL, O. Teoria da Informação e da Codificação. Translated by José Carlos Magossi. Campinas: Editora da Unicamp, 2018.
SHANNON, C. E. A Mathematical Theory of Communication. The Bell System Technical Journal, v. 27, p. 379-423, p. 623-656, jul. 1948.
SHANNON, C. E.; WEAVER, W. The Mathematical Theory of Communication. Urbana, Illinois: University of Illinois Press, 1949.