Estimating Total Correlation with Mutual Information Estimators

From MaRDI portal
Publication: Q6353370

arXiv: 2011.04794
MaRDI QID: Q6353370


Authors: Ke Bai, Pengyu Cheng, Weituo Hao, Ricardo Henao, Lawrence Carin


Publication date: 9 November 2020

Abstract: Total correlation (TC) is a fundamental concept in information theory that measures statistical dependency among multiple random variables. Recently, TC has shown noticeable effectiveness as a regularizer in many learning tasks, where the correlation among multiple latent embeddings needs to be jointly minimized or maximized. However, calculating precise TC values is challenging, especially when the closed-form distributions of embedding variables are unknown. In this paper, we introduce a unified framework to estimate total correlation values with sample-based mutual information (MI) estimators. More specifically, we establish a relation between TC and MI and propose two types of calculation paths (tree-like and line-like) to decompose TC into MI terms. With each MI term bounded, the TC values can be successfully estimated. Further, we provide theoretical analyses concerning the statistical consistency of the proposed TC estimators. Experiments are presented on both synthetic and real-world scenarios, where our estimators demonstrate effectiveness in TC estimation, minimization, and maximization tasks. The code is available at https://github.com/Linear95/TC-estimation.
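The two decomposition paths mentioned in the abstract can be illustrated numerically. The sketch below is a minimal illustration, not the paper's method: it uses a multivariate Gaussian, for which TC and MI have closed forms, as a stand-in for the sample-based neural MI estimators the paper actually employs. The line-like path writes TC(X1,...,Xn) as a telescoping sum of MI terms MI(X_{1:i-1}; X_i); the tree-like path recursively splits the variables in half. The function names (`gaussian_mi`, `gaussian_tc`, etc.) are illustrative, not from the paper's code.

```python
import numpy as np

def gaussian_mi(cov, idx_a, idx_b):
    """Closed-form MI between two blocks of a joint Gaussian:
    0.5 * (log det S_A + log det S_B - log det S_AB)."""
    idx = idx_a + idx_b
    ld = lambda ix: np.linalg.slogdet(cov[np.ix_(ix, ix)])[1]
    return 0.5 * (ld(idx_a) + ld(idx_b) - ld(idx))

def gaussian_tc(cov):
    """Ground-truth TC for a Gaussian:
    sum of marginal entropies minus joint entropy."""
    return 0.5 * (np.sum(np.log(np.diag(cov))) - np.linalg.slogdet(cov)[1])

def tc_line(cov):
    """Line-like path: TC = sum_i MI(X_{1:i-1}; X_i)."""
    n = cov.shape[0]
    return sum(gaussian_mi(cov, list(range(i)), [i]) for i in range(1, n))

def tc_tree(cov, idx=None):
    """Tree-like path: TC(S) = MI(left; right) + TC(left) + TC(right)."""
    if idx is None:
        idx = list(range(cov.shape[0]))
    if len(idx) == 1:
        return 0.0
    mid = len(idx) // 2
    left, right = idx[:mid], idx[mid:]
    return gaussian_mi(cov, left, right) + tc_tree(cov, left) + tc_tree(cov, right)

# Random positive-definite covariance over 4 variables.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
cov = A @ A.T + 4 * np.eye(4)

# Both decomposition paths recover the exact TC.
print(np.isclose(gaussian_tc(cov), tc_line(cov)))
print(np.isclose(gaussian_tc(cov), tc_tree(cov)))
```

In the paper's setting the closed-form `gaussian_mi` would be replaced by a sample-based MI bound, so each decomposition trades exactness for estimability; the identities verified here are what make that substitution valid.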




Has companion code repository: https://github.com/linear95/tc-estimation









