STATISTICAL MECHANICS
The foundations of Statistical Mechanics from an entropic and Bayesian point of view are presented in the (still unfinished) book “Entropic Physics: Probability, Entropy, and the Foundations of Physics.”
The articles below discuss applications and other developments in this direction.
The conjecture that all variational principles in physics might be in one way or another a manifestation of maximum entropy is appealing. In the following papers we show that the maximum entropy method allows us to derive and then generalize the Bogoliubov variational principle, with applications to the structure of simple fluids and to density functional theory.
“Using relative entropy to find optimal approximations: An application to simple fluids” (with Chih-Yuan Tseng) Physica A 387, 6759 (2008).
“An entropic approach to classical Density Functional Theory” (with A. Yousefi) Phys. Sci. Forum 3, 13 (2021); arXiv:2108.01594.
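The inequality at the heart of this approach can be sketched as follows; this is the standard Gibbs–Bogoliubov form written in generic notation, not an excerpt from the papers above. For the canonical distribution $p \propto e^{-\beta H}$ and a trial distribution $p_0 \propto e^{-\beta H_0}$, the non-negativity of the relative entropy,

$$
S[p_0 \,\|\, p] = \int dq\; p_0 \ln \frac{p_0}{p} \;\geq\; 0 ,
$$

yields the variational bound

$$
F \;\leq\; F_0 + \langle H - H_0 \rangle_0 ,
$$

which is then minimized over the family of trial Hamiltonians $H_0$.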
My interest in the exact Renormalization Group (RG) as a systematic way to choose the variables that are relevant to a particular physics problem, whether it be in Statistical Mechanics or in Quantum Field Theory, goes back to my PhD thesis.
“Changes of variables and the renormalization group” PhD thesis, California Institute of Technology, May 1985.
“Changes of variables and the renormalization group” Caltech preprint CALT-68-1099 (1984); arXiv:1605.06366.
“A gauge covariant renormalization group” Caltech preprint CALT-68-1022 (1984).
More recently, the connection between the RG and Entropic Dynamics was explored in the following paper.
“Exact renormalization groups as a form of entropic dynamics” (with P. Pessoa) Entropy 20, 25 (2018); arXiv:1712.02267.
In some circles the Gibbs paradox remains somewhat controversial, even though a perfectly acceptable explanation was offered by Gibbs himself. Once one realizes that entropy is not a property of the system alone, but that different entropies are assigned to different descriptions of the system, it is possible to formulate a simple explanation based on information theory.
“Yet another resolution of the Gibbs paradox: an information theory approach” (with Chih-Yuan Tseng) in Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2001), ed. by R. L. Fry, A.I.P. Vol. 617, 331 (2002); arXiv:cond-mat/0109324.
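A textbook illustration of the point (generic notation, not a quotation from the paper): when a partition separating $N$ particles of gas $A$ and $N$ particles of gas $B$, each occupying volume $V$, is removed, the entropy change is

$$
\Delta S = 2Nk \ln 2
$$

if the description distinguishes $A$ from $B$, but $\Delta S = 0$ if it does not. The entropy assignment depends on the description, not on the physical state alone.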
The realization that entropy is not a “physical” concept, but a tool for inference, means that entropic methods can be applied to fields other than physics. In the paper below we constructed an entropic dynamics that might apply to models of networks or perhaps ecologies.
“Entropic dynamics on Gibbs statistical manifolds” (with P. Pessoa and F. X. Costa) Entropy 23, 494 (2021); arXiv:2008.04683.
In the following paper with A. Golan we explored the application of MaxEnt to economies in equilibrium. This led to a number of interesting insights: even though the state of economic equilibrium is a maximum entropy state, there is no economic analogue of the second law of thermodynamics. Prices, which regulate the flow of goods, are Lagrange multipliers analogous to a temperature or a chemical potential. Perhaps most interesting from an economic point of view is that in this entropic approach to equilibrium no assumptions are made about the rationality of the agents participating in the economy.
“An Entropic framework for Modeling Economies” (with A. Golan) Physica A 408, 149 (2014).
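The multiplier-as-price idea can be illustrated with a minimal numerical sketch: maximizing entropy over a discrete set of states subject to a fixed expected value yields a Gibbs-form distribution, and the Lagrange multiplier enforcing the constraint plays the temperature- or price-like role described above. The numbers below are hypothetical and purely illustrative; this is not code or data from the paper.

```python
import math

def maxent_dist(f, lam):
    """Gibbs-form maximum-entropy distribution p_i proportional to exp(-lam * f_i)."""
    w = [math.exp(-lam * fi) for fi in f]
    Z = sum(w)  # normalization (partition function)
    return [wi / Z for wi in w]

def mean(f, p):
    """Expected value <f> under distribution p."""
    return sum(fi * pi for fi, pi in zip(f, p))

def solve_multiplier(f, F, lo=-50.0, hi=50.0):
    """Find the multiplier lam enforcing <f> = F by bisection.

    <f> is monotonically decreasing in lam, so if the mean at the midpoint
    is still too large we need a larger lam (move lo up), and vice versa.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(f, maxent_dist(f, mid)) > F:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical per-state quantities and a constrained expected value.
f = [1.0, 2.0, 3.0, 4.0]
F = 2.0
lam = maxent_multiplier = solve_multiplier(f, F)
p = maxent_dist(f, lam)
print(lam, [round(pi, 4) for pi in p])
```

The solved multiplier is the analogue of the price: it is not imposed from outside but emerges as the value that makes the constraint consistent with the maximum entropy distribution.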