On the entropy of couplings
From MaRDI portal
Publication:2346421
DOI: 10.1016/j.ic.2015.04.003
zbMath: 1315.94023
arXiv: 1303.3235
OpenAlex: W2112883966
MaRDI QID: Q2346421
Vojin Šenk, Mladen Kovačević, Ivan Stanojević
Publication date: 1 June 2015
Published in: Information and Computation
Full work available at URL: https://arxiv.org/abs/1303.3235
Keywords: partition; coupling; contingency table; information measure; subset sum; entropy minimization; distribution with fixed marginals; measure of dependence; entropy metric; maximization of information divergence
MSC classification: Combinatorial optimization (90C27); Measures of information, entropy (94A17); Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.) (68Q17); Distribution theory (60E99)
Related Items
- Playing games with bounded entropy
- Observational nonidentifiability, generalized likelihood and free energy
- Heapability, Interactive Particle Systems, Partial Orders: Results and Open Problems
- Empirical Regularized Optimal Transport: Statistical Theory and Applications
- Information-geometric equivalence of transportation polytopes
Cites Work
- A Mathematical Theory of Communication
- A note on the complexity of \(L_p\) minimization
- Minimum entropy combinatorial optimization problems
- Computational complexity of norm-maximization
- Normalized information-based divergences
- Pattern recognition as a quest for minimum entropy
- On the continuity of correspondences on sets of measures with restricted marginals
- Basic concepts, identities and inequalities -- the toolkit of information theory
- Maximum entropy fundamentals
- An information-geometric approach to a theory of pragmatic structuring
- Entropy and Information Theory
- On the Complexity of Nash Equilibria and Other Fixed Points
- Information Theory and Statistical Mechanics
- On measures of dependence
- A Variable-Complexity Norm Maximization Problem
- A First Course in Optimization Theory
- The MinMax information measure
- Information distance
- On the Discontinuity of the Shannon Information Measures
- Finding the Maximizers of the Information Divergence From an Exponential Family
- The Interplay Between Entropy and Variational Distance
- On the Interplay Between Conditional Entropy and Error Probability
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- A Metric Between Probability Distributions on Finite Sets of Different Cardinalities and Applications to Order Reduction
- Elements of Information Theory
- On the Künneth Formula and Functorial Dependence in Algebraic Topology
- Mutual Information and Maximal Correlation as Measures of Dependence
- Information Theory and Statistics: A Tutorial