Machine learning with squared-loss mutual information
DOI: 10.3390/e15010080 · zbMath: 1371.68241 · OpenAlex: W2008901952 · MaRDI QID: Q742658
Publication date: 19 September 2014
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15010080
Keywords: clustering; independent component analysis; machine learning; dimensionality reduction; causal inference; squared-loss mutual information; density-ratio estimation; independence testing; object matching; Pearson divergence
MSC classifications:
- Factor analysis and principal components; correspondence analysis (62H25)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Statistical aspects of information-theoretic topics (62B10)
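The central quantity of the record, squared-loss mutual information (SMI), is half the Pearson divergence between the joint distribution and the product of its marginals. As an illustrative sketch (not code from the paper), it can be computed directly for a discrete joint probability table:

```python
def squared_loss_mi(p_xy):
    """Squared-loss mutual information of a discrete joint table.

    SMI = (1/2) * sum_{x,y} p(x) p(y) * (p(x,y) / (p(x) p(y)) - 1)^2,
    i.e. half the Pearson divergence between the joint and the product
    of marginals. `p_xy` is a list of lists summing to 1.
    """
    n_x, n_y = len(p_xy), len(p_xy[0])
    p_x = [sum(row) for row in p_xy]                      # marginal of X
    p_y = [sum(p_xy[i][j] for i in range(n_x)) for j in range(n_y)]  # marginal of Y
    smi = 0.0
    for i in range(n_x):
        for j in range(n_y):
            prod = p_x[i] * p_y[j]
            if prod > 0:
                ratio = p_xy[i][j] / prod                 # density ratio r(x, y)
                smi += 0.5 * prod * (ratio - 1.0) ** 2
    return smi
```

SMI is zero exactly when X and Y are independent (the ratio is 1 everywhere); for a perfectly dependent 2x2 table such as [[0.5, 0], [0, 0.5]] it equals 0.5. The paper's methods estimate this quantity from samples via direct density-ratio estimation rather than from known distributions.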
Cites Work
- Direct importance estimation for covariate shift adaptation
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Blind separation of sources. I: An adaptive algorithm based on neuromimetic architecture
- Dimensionality reduction for density ratio estimation in high-dimensional spaces
- Least angle regression. (With discussion)
- Nonparametric and semiparametric models.
- Canonical correlation analysis based on information theory
- Weak convergence and empirical processes. With applications to statistics
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Canonical dependency analysis based on squared-loss mutual information
- Least-squares two-sample test
- Kernel dimension reduction in regression
- 10.1162/153244302760185252
- 10.1162/153244303768966085
- Least-Squares Independent Component Analysis
- Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
- Independent coordinates for strange attractors from mutual information
- The estimation of the gradient of a density function, with applications in pattern recognition
- Robust and efficient estimation by minimising a density power divergence
- Asymptotic Statistics
- The Geometry of Algorithms with Orthogonality Constraints
- SAVE: a method for dimension reduction and graphics in regression
- Estimation of the information by an adaptive partitioning of the observation space
- 10.1162/153244303322753616
- $f$-Divergence Estimation and Two-Sample Homogeneity Test Under Semiparametric Density-Ratio Models
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Sequential Fixed-Point ICA Based on Mutual Information Minimization
- Edgeworth Approximation of Multivariate Differential Entropy
- Relations Between Two Sets of Variates
- On Information and Sufficiency