Preprint / Version 1

Natural Attractive Interactions and the Contraction of Information Radius


DOI:

https://doi.org/10.51094/jxiv.2033

Keywords:

mutual information, information distance, pointwise mutual information, canonical Gibbs distributions, stochastic dynamics, contraction

Abstract

Mutual information MI(X_t; Y_t) describes how knowledge of one subsystem reduces the uncertainty of another. We study the time evolution of the information distance
r(t) = H(X_t) + H(Y_t) - 2·MI(X_t; Y_t),
defined in terms of the marginal entropies and the mutual information of a pair of interacting variables (X_t, Y_t). The joint distribution evolves according to a continuity equation with velocity fields v_x and v_y. Under standard smoothness and no-flux assumptions, we derive variational identities for the time derivatives of the marginal entropies and the mutual information, and combine them into a master identity for the rate of change of r(t) expressed through the velocity fields and the pointwise mutual information (PMI),
φ = log p - log p_X - log p_Y.
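As a concrete numerical check (a minimal sketch of ours, not taken from the preprint; the function name is illustrative), the information distance defined above can be evaluated directly for a discrete joint distribution, using MI = H(X) + H(Y) − H(X,Y):

```python
import numpy as np

def information_distance(p_xy):
    """r = H(X) + H(Y) - 2*MI(X;Y) in nats, for a joint pmf given as a 2-D array.

    Substituting MI = H(X) + H(Y) - H(X,Y), this equals 2*H(X,Y) - H(X) - H(Y),
    i.e. the variation of information between X and Y.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    entropy = lambda p: -np.sum(p[p > 0] * np.log(p[p > 0]))
    h_x = entropy(p_xy.sum(axis=1))   # marginal entropy H(X)
    h_y = entropy(p_xy.sum(axis=0))   # marginal entropy H(Y)
    h_xy = entropy(p_xy.ravel())      # joint entropy H(X,Y)
    return 2.0 * h_xy - h_x - h_y

# Perfectly coupled bits: MI is maximal and r = 0.
print(information_distance([[0.5, 0.0], [0.0, 0.5]]))   # → 0.0
# Independent uniform bits: MI = 0, so r = H(X) + H(Y) = 2*log(2).
print(information_distance([[0.25, 0.25], [0.25, 0.25]]))
```

The identity r = 2H(X,Y) − H(X) − H(Y) makes explicit that r shrinks exactly when the joint distribution becomes more concentrated relative to its marginals.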

For canonical Gibbs models with density proportional to
exp{−β[U(x) + U(y) + W(x, y)]},
we show that the gradients of φ coincide with centered interaction forces and are orthogonal—under the L2(p) inner product—to all marginal-only drift fields. This leads to a natural decomposition of the velocity field into a PMI-gradient component and a marginal drift component, written as
v = γ·∇φ + u.
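The "centered interaction force" statement can be made concrete by a short computation (our reconstruction from the stated density, not quoted from the paper). Since p(y | x) ∝ exp{−β[U(y) + W(x,y)]} and ∇_x log p_Y(y) = 0, only the interaction terms survive in ∇_x φ:

```latex
% With p(x,y) \propto e^{-\beta[U(x)+U(y)+W(x,y)]} and \varphi = \log p - \log p_X - \log p_Y:
\nabla_x \log p(x,y) = -\beta\,\bigl[\,U'(x) + \partial_x W(x,y)\,\bigr],
\qquad
\nabla_x \log p_X(x) = -\beta\,U'(x) - \beta\,\bigl\langle \partial_x W(x,\cdot) \bigr\rangle_{p(\cdot\,|\,x)},
```
```latex
\nabla_x \varphi(x,y)
  = -\beta\,\Bigl[\partial_x W(x,y) - \bigl\langle \partial_x W(x,\cdot) \bigr\rangle_{p(\cdot\,|\,x)}\Bigr],
```

where ⟨·⟩_{p(·|x)} denotes the expectation over y ~ p(y | x). The single-particle force −βU′(x) cancels, leaving the interaction force centered by its conditional mean, and the symmetric statement holds for ∇_y φ.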
When the dynamics is driven purely by PMI gradients and the marginals are preserved, we obtain an exact contraction theorem: the information distance r(t) decreases at a rate proportional to a PMI-based “Fisher energy.” When marginal drifts are present, we derive a quantitative inequality showing that r(t) still decreases whenever the PMI-gradient energy dominates the entropy production induced by the marginal drifts. Gaussian and coupled Ornstein–Uhlenbeck examples illustrate these mechanisms and provide explicit parameter ranges under which contraction holds.
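For the Gaussian examples, the relevant quantities have closed forms: H(X) = ½log(2πeσ_x²) and MI = −½log(1−ρ²). A small sketch (illustrative parameters of ours, not from the preprint) confirms that with marginals held fixed, r is strictly decreasing in |ρ|, consistent with the marginal-preserving contraction mechanism:

```python
import numpy as np

def gaussian_r(var_x, var_y, rho):
    """Information distance r = H(X) + H(Y) - 2*MI for a bivariate Gaussian (nats)."""
    h_x = 0.5 * np.log(2 * np.pi * np.e * var_x)   # differential entropy of X
    h_y = 0.5 * np.log(2 * np.pi * np.e * var_y)   # differential entropy of Y
    mi = -0.5 * np.log(1.0 - rho**2)               # Gaussian mutual information
    return h_x + h_y - 2.0 * mi

# Fixed unit marginals; |rho| plays the role of the coupling built up by the
# PMI-gradient flow. r decreases monotonically as |rho| grows.
rhos = np.linspace(0.0, 0.95, 20)
rs = [gaussian_r(1.0, 1.0, rho) for rho in rhos]
assert all(a > b for a, b in zip(rs, rs[1:]))      # strict contraction in |rho|
```

Since r(ρ) = log(2πe σ_x σ_y) + log(1−ρ²) at fixed marginals, r → −∞ as |ρ| → 1; being built from differential entropies, r can be negative.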

Conflicts of Interest Disclosure

The authors declare no potential conflicts of interest.




Posted

Submitted: 2025-11-27 12:27:51 UTC

Published: 2025-12-04 01:46:30 UTC

Section: Physics