Shannon's entropy: an axiomatic approach

Authors

  • José Carlos Magossi, Universidade Estadual de Campinas (UNICAMP), Faculdade de Tecnologia (FT), Divisão de Telecomunicações, Limeira, SP, Brazil. https://orcid.org/0000-0001-8985-6507
  • Antônio César da Costa Barros, Universidade Estadual de Campinas (UNICAMP), Faculdade de Tecnologia (FT), doctoral student in the Programa de Pós-Graduação em Tecnologia (PPGT), Limeira, SP, Brazil. https://orcid.org/0000-0002-4822-9459

DOI:

https://doi.org/10.35819/remat2021v7i1id4756

Keywords:

Axioms, Entropy, Communication, Shannon's Theory, Functional Equations

Abstract

The word "entropy" arose in the year 1864, in the works of thermodynamics by Rudolf Clausius. In 1948, Claude E. Shannon uses that same name to designate a measure of information in his mathematical model of communication, based on the concepts of emitter, receiver, channel, noise, redundancy, coding and decoding. With the measure of information H(X)=-C*sum_{i=1}^n[pi*log (pi)], the Shannon entropy, it becomes possible to analyze the capacity of the communication channel and invest in data processing, giving rise to what is currently called Information Theory. In addition to the operational and technological aspects of theory of Shannon, that reveal the digital age, from mathematical approaches about the formula H(X), also end up revealing a tendency focused on the characterization of information measures. It is understood that an exposure didactic of mathematical deduction of the formula from Shannon entropy, based on a group of axioms, not only is interesting in the pedagogical sense, but also for understanding theory of Shannon. It thereby show, that this formula is immersed in a well-defined mathematical context (a system with axioms and functional equations), allowing, with changes in the axioms, defining new measures of information.



References

ACZÉL, J.; DARÓCZY, Z. On measures of information and their characterizations. v. 115, New York: Academic Press, 1975.

ACZÉL, J.; FORTE, B.; NG, C. T. Why the Shannon and Hartley entropies are "natural". Advances in Applied Probability, v. 6, n. 1, p. 131-146, 1974. DOI: https://doi.org/10.2307/1426210.

ACZÉL, J. Measuring information beyond communication theory: Some probably useful and some almost certainly useless generalizations. Information Processing & Management, v. 20, n. 3, p. 383-395, 1984a.

ACZÉL, J. Measuring information beyond communication theory: Why some generalized information measures may be useful, others not. Survey paper. Aequationes Mathematicae, v. 27, p. 1-19, 1984b.

ASH, R. B. Information Theory. New York: Dover Publications, 1990.

BARTLE, R. G.; SHERBERT, D. R. Introduction to Real Analysis. 4. ed, New York: John Wiley & Sons, 2011.

CLAUSIUS, R. Abhandlungen über die mechanische Wärmetheorie. Braunschweig: Druck Und Verlag Von Friedrich Vieweg Und Sohn, 1864.

COVER, T. M.; THOMAS, J. A. Elements of Information Theory. 1. ed. John Wiley & Sons, 1991; 2. ed. Hoboken, NJ: John Wiley & Sons, 2006.

EBANKS, B.; SAHOO, P.; SANDER, W. Characterizations of Information Measures. Singapura: World Scientific, 1998.

FADDEEV, D. K. On the Concept of Entropy of a Finite Probability Scheme. Originally published in Russian in Uspekhi Matematicheskikh Nauk, v. 11, n. 1 (67), p. 227-231, 1956.

INGARDEN, R. S.; URBANIK, K. Information without probability. Colloquium Mathematicum, v. IX, p. 132-150, 1962.

KHINCHIN, A. I. Mathematical Foundations of Information Theory. Translated by R. A. Silverman and M. D. Friedman. New York: Dover Publications, 1957.

MAGOSSI, J. C.; PAVIOTTI, J. R. Incerteza em Entropia. Revista Brasileira de História da Ciência, Rio de Janeiro, v. 12, n. 1, p. 84-96, jan./jun. 2019.

PIERCE, J. R. An Introduction to Information Theory - Symbols, Signals and Noise. New York: Dover Publications, 1961.

REZA, F. M. An Introduction to Information Theory. New York: McGraw-Hill, 1961.

RIOUL, O. Teoria da Informação e da Codificação. Translated by José Carlos Magossi. Campinas: Editora da Unicamp, 2018.

SHANNON, C. E. A Mathematical Theory of Communication. The Bell System Technical Journal, v. 27, p. 379-423 (July) and p. 623-656 (October), 1948.

SHANNON, C. E.; WEAVER, W. The Mathematical Theory of Communication. Urbana, Illinois: University of Illinois Press, 1949.

Published

2021-05-26

Issue

Section

Mathematics

How to Cite

Shannon's entropy: an axiomatic approach. REMAT: Revista Eletrônica da Matemática, Bento Gonçalves, RS, v. 7, n. 1, p. e3013, 2021. DOI: 10.35819/remat2021v7i1id4756. Available at: https://periodicos.ifrs.edu.br/index.php/REMAT/article/view/4756. Accessed: 19 Nov. 2024.
