Preprint / Version 1

Minimal-Action Information Geometry: Unified Information Distance Dinfo

Authors

  • Mori, Tsutomu (Department of Human Lifesciences, Fukushima Medical University School of Nursing)

DOI:

https://doi.org/10.51094/jxiv.2073

Keywords:

information distance, mutual information, influential force, repulsive interactions, information geometry, natural forces

Abstract

The unification of internal (gauge) and external (spacetime) coordinates remains a central challenge in the search for a unified description of fundamental interactions. Several geometric approaches, including quantum Riemannian metrics, information-geometric distances, and Wasserstein transport, have been explored; however, no single metric spans classical probability distributions, quantum states, and heterogeneous information carriers. We therefore sought a minimal-action geometric framework capable of unifying these regimes within a single state space. Starting from an abstract, information-theoretic action constrained by locality, additivity, reparametrization invariance, joint convexity, and contractivity under admissible channels, we demonstrate that the action necessarily takes a quadratic-norm form, inducing a reparametrization-independent geodesic distance. Requiring a faithful embedding into a real Hilbert space then identifies Jensen-Shannon-type divergences as the canonical metrics compatible with these axioms. The resulting unified information distance, Dinfo, recovers the square root of the Jensen-Shannon divergence for classical distributions and of its quantum analogue for density operators, and extends naturally to heterogeneous and hybrid systems. This minimal-action construction establishes Dinfo as a derived metric rather than a postulated one. Consequently, Dinfo provides a rigorous, coordinate-independent foundation for analyzing entropy-driven interactions, information contraction, and multiscale informational structures across physical, biological, and computational contexts.
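As a concrete illustration of the two limiting cases named in the abstract, the sketch below (not part of the preprint itself; the function names and the eigenvalue cutoff are illustrative assumptions) computes Dinfo as the square root of the Jensen-Shannon divergence for classical distributions, and of its quantum analogue built from the von Neumann entropy for density operators.

```python
# Minimal numerical sketch of Dinfo in its two limiting regimes.
# Not the paper's code: function names and the eigenvalue cutoff are illustrative.
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats; zero-probability terms contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def d_info_classical(p, q):
    """Square root of the Jensen-Shannon divergence between two distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    jsd = shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
    return np.sqrt(max(jsd, 0.0))  # clip tiny negative values from round-off

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log(evals))

def d_info_quantum(rho, sigma):
    """Square root of the quantum Jensen-Shannon divergence between density operators."""
    m = 0.5 * (rho + sigma)
    qjsd = von_neumann_entropy(m) - 0.5 * (von_neumann_entropy(rho) + von_neumann_entropy(sigma))
    return np.sqrt(max(qjsd, 0.0))

if __name__ == "__main__":
    # Classical case: two biased coins.
    print(d_info_classical([0.5, 0.5], [0.9, 0.1]))
    # Quantum case: a pure state versus the maximally mixed qubit state.
    rho = np.array([[1.0, 0.0], [0.0, 0.0]])
    sigma = np.eye(2) / 2
    print(d_info_quantum(rho, sigma))
```

Entropies above are computed in nats; with base-2 logarithms the Jensen-Shannon divergence is bounded by 1, so that variant of Dinfo is bounded by 1 as well.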

Disclosure of Conflicts of Interest

The authors declare no potential conflicts of interest.

Status: Published

Submitted: 2025-12-04 09:19:37 UTC

Published: 2025-12-15 01:29:58 UTC
Research field: Physics