
PROBABILITY, ENTROPY, AND INFORMATION GEOMETRY

Instead of the more traditional view of physics as a direct description of reality, our goal is to formulate physics as a set of models or tools for making probabilistic inferences about reality. Developing such an “Entropic Physics” requires Bayesian and entropic methods that clarify the interpretation of the concepts of probability, information, and entropy. Most of this material is presented in the (still unfinished) book “Entropic Physics: Probability, Entropy, and the Foundations of Physics.”

The following are recent reviews on the conceptual foundations of entropy and information.

“Entropy, Information, and the Updating of Probabilities” in the special volume on Statistical Foundations of Entropy, ed. by P. Jizba and J. Korbel, Entropy 23, 895 (2021); arXiv:2107.04529.

“Towards an Informational Pragmatic Realism” in the special issue on “Luciano Floridi and the Philosophy of Information”, Minds and Machines 24, 37-70 (2014); arXiv:1412.5644.

This is a brief tutorial about information geometry:

“The Basics of Information Geometry” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2014), ed. by A. Mohammad-Djafari and F. Barbaresco, AIP Conf. Proc. 1641, 15 (2015); arXiv:1412.5633.

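For orientation, the central object of the tutorial is the information metric on a statistical manifold. As a minimal sketch (the notation is mine, not quoted from the paper): the extent to which a distribution p(x|θ) can be distinguished from its neighbor p(x|θ+dθ) is measured by the distance dℓ given by

\[
d\ell^{2} = g_{ij}\, d\theta^{i} d\theta^{j}, \qquad
g_{ij}(\theta) = \int dx\; p(x|\theta)\,
\frac{\partial \log p(x|\theta)}{\partial \theta^{i}}\,
\frac{\partial \log p(x|\theta)}{\partial \theta^{j}},
\]

where g_ij(θ) is the Fisher-Rao information metric; the tutorial discusses the sense in which this metric is essentially the unique one suited to inference.
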
The pragmatic design of probability as a tool to reason with incomplete information is explored in the following papers.

“Towards an Informational Pragmatic Realism” in the special issue on “Luciano Floridi and the Philosophy of Information”, Minds and Machines 24, 37-70 (2014); arXiv:1412.5644.

“Quantifying Rational Belief” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2009), ed. by P. Goggans et al., AIP Conf. Proc. 1193, 60 (2009); arXiv:0908.3212.

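To indicate where this design program leads: consistency requirements of the Cox type single out the standard sum and product rules as the suitable calculus for degrees of rational belief. Schematically (the notation is mine, not quoted from the papers),

\[
p(a|c) + p(\tilde{a}\,|c) = 1, \qquad
p(ab|c) = p(a|c)\, p(b|ac),
\]

the claim being that any consistent quantification of rational belief can be regraduated to obey these two rules, and it is this that justifies calling such degrees of belief probabilities.
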
At MaxEnt 2006, we presented a derivation of (relative) entropy as a tool for updating probabilities. Although the derivation has since been streamlined considerably, the main ideas remain valid to this day. A particularly significant contribution was the demonstration that the Method of Maximum (relative) Entropy includes as special cases both classical MaxEnt and Bayes’ updating rule. Bayesian and entropic methods were thereby unified into a single scheme, which settled the issue of their compatibility.

“Updating Probabilities” (with A. Giffin) in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2006), ed. by A. Mohammad-Djafari, AIP Conf. Proc. 872, 31 (2007); arXiv:physics/0608185.

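To convey the flavor of the unification, here is a compact sketch (my notation; the paper supplies the details). One maximizes the relative entropy of a joint distribution of data x and parameters θ,

\[
S[p,q] = -\int dx\, d\theta\; p(x,\theta)\, \log\frac{p(x,\theta)}{q(x,\theta)},
\]

subject to the constraint that the marginal over x agree with the observed data x′, that is, p(x) = ∫ dθ p(x,θ) = δ(x − x′). The maximization yields

\[
p(\theta) \propto q(\theta)\, q(x'|\theta),
\]

which is Bayes’ rule; imposing expectation-value constraints instead recovers classical MaxEnt.
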
At MaxEnt 2007, the entropic framework was further strengthened with a pragmatic interpretation of information in terms of its effects on the epistemological concerns of ideally rational agents. What is noteworthy here is that it was no longer necessary to think in terms of amounts of information, but rather in terms of the effects of information.

“Information and Entropy” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2007), ed. by K. Knuth et al., AIP Conf. Proc. 954, 11 (2007); arXiv:0710.1068.

The framework of entropic inference was further developed in the following papers and tutorials presented at the MaxEnt conferences.

“Entropic inference: some pitfalls and paradoxes we can avoid” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2012), ed. by U. von Toussaint et al., AIP Conf. Proc. 1553, 200 (2013); arXiv:1212.6967.

“Entropic Inference” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2010), ed. by A. Mohammad-Djafari et al., AIP Conf. Proc. 1305, 20 (2010); arXiv:1011.0723.

“Updating Probabilities with Data and Moments” (with A. Giffin) in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2007), ed. by K. Knuth et al., AIP Conf. Proc. 954, 74 (2007); arXiv:0708.1593.

“Relative entropy and inductive inference” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2003), ed. by G. Erickson and Y. Zhai, AIP Conf. Proc. 707, 75 (2004); arXiv:physics/0311093.

“Maximum Entropy, fluctuations, and priors” in Maximum Entropy and Bayesian Methods in Science and Engineering (MaxEnt 2000), ed. by A. Mohammad-Djafari, AIP Conf. Proc. 568, 94 (2001); arXiv:math-ph/0008017.

The use of entropy to assign prior distributions—the entropic priors—is explored in the following papers:

“Maximum Entropy and Bayesian Data Analysis: Entropic Prior Distributions” (with R. Preuss), Phys. Rev. E 70, 046127 (2004); arXiv:physics/0307055.

“Maximum Entropy, fluctuations, and priors” in Maximum Entropy and Bayesian Methods in Science and Engineering (MaxEnt 2000), ed. by A. Mohammad-Djafari, AIP Conf. Proc. 568, 94 (2001); arXiv:math-ph/0008017.

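As a rough indication of the idea (schematic only; the papers treat the choice of measure and the hyperparameter α with considerably more care), an entropic prior weights the parameters θ of a model p(x|θ) by the entropy of that model relative to a default measure q(x):

\[
\pi(\theta) \propto \mu(\theta)\, e^{\alpha S(\theta)}, \qquad
S(\theta) = -\int dx\; p(x|\theta)\, \log\frac{p(x|\theta)}{q(x)},
\]

where μ(θ) is a measure on the parameter space; the form shown here is my paraphrase of the general structure rather than a quotation from the papers.
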
“What is a question?” A probability distribution reflects incomplete information and is, therefore, an implicit request for information. The idea that a question is a probability distribution is explored in the paper below.

“Questions, Relevance, and Relative Entropy” in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2004), ed. by V. Dose, R. Fischer, R. Preuss and U. von Toussaint, AIP Conf. Proc. 735, 429 (2004); arXiv:cond-mat/0409175.
