Machine learning with squared-loss mutual information (Q742658)

Language: English
Label: Machine learning with squared-loss mutual information
Description: scientific article
    Statements

    Machine learning with squared-loss mutual information (English)
    19 September 2014
    Summary: Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called \textit{squared-loss} MI (SMI) was introduced. While ordinary MI is the Kullback-Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the \(f\)-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation, as well as SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
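    As background for the summary above, SMI is the Pearson divergence between the joint density \(p_{xy}(x,y)\) and the product of the marginals \(p_x(x)p_y(y)\), i.e. \(\mathrm{SMI} = \frac{1}{2}\iint p_x(x)\,p_y(y)\Bigl(\frac{p_{xy}(x,y)}{p_x(x)p_y(y)} - 1\Bigr)^2 dx\,dy\), and it can be approximated by fitting the density ratio \(p_{xy}/(p_x p_y)\) directly under the squared loss. The sketch below illustrates such a least-squares, kernel-based estimation scheme in NumPy; the Gaussian basis, kernel width, regularization constant, and toy data are assumptions of this sketch, not details prescribed by the article.

```python
# A minimal NumPy sketch of SMI estimation via direct density-ratio
# estimation (least-squares style). Illustrative only: the Gaussian basis,
# sigma, lam, and the toy data below are assumptions of this sketch.
import numpy as np

def smi_estimate(x, y, n_basis=100, sigma=1.0, lam=1e-3, seed=0):
    """Estimate squared-loss mutual information between paired samples x, y.

    The ratio r(x, y) = p(x, y) / (p(x) p(y)) is modeled as a linear
    combination of Gaussian kernels centered on a subset of the paired
    samples; the coefficients have a ridge-style closed-form solution.
    """
    x = np.atleast_2d(x.T).T  # ensure shape (n, d_x)
    y = np.atleast_2d(y.T).T  # ensure shape (n, d_y)
    n = x.shape[0]
    rng = np.random.default_rng(seed)
    centers = rng.choice(n, size=min(n_basis, n), replace=False)

    def gauss(a, b):
        # Gaussian kernel matrix: entry (i, l) = exp(-||a_i - b_l||^2 / (2 sigma^2))
        sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-sq / (2 * sigma ** 2))

    K = gauss(x, x[centers])  # kernels on x, shape (n, b)
    L = gauss(y, y[centers])  # kernels on y, shape (n, b)

    # h_l: basis averaged over joint samples (x_i, y_i)
    h = (K * L).mean(axis=0)
    # H_{ll'}: basis products averaged over all n^2 pairs (x_i, y_j),
    # i.e. an empirical average over the product of the marginals
    H = (K.T @ K) * (L.T @ L) / n ** 2

    # Regularized closed-form coefficients for the ratio model
    alpha = np.linalg.solve(H + lam * np.eye(H.shape[0]), h)

    # Plug-in SMI estimate: (1/2) * E_joint[r_hat] - 1/2
    return 0.5 * h @ alpha - 0.5

# Toy usage: dependent vs. independent Gaussian pairs
rng = np.random.default_rng(1)
x = rng.normal(size=500)
y_dep = x + 0.3 * rng.normal(size=500)
y_ind = rng.normal(size=500)
print("SMI (dependent):  ", smi_estimate(x, y_dep))
print("SMI (independent):", smi_estimate(x, y_ind))
```

    On such toy data the dependent pair should yield a clearly positive estimate while the independent pair stays near zero, which is the kind of statistic the SMI-based independence tests mentioned in the summary build on.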
    squared-loss mutual information
    Pearson divergence
    density-ratio estimation
    independence testing
    dimensionality reduction
    independent component analysis
    object matching
    clustering
    causal inference
    machine learning