Estimating Squared-Loss Mutual Information for Independent Component Analysis
From MaRDI portal
Publication: 3614944
Cites work
- Scientific article (zbMATH DE number 2107836; no title available)
- DOI: 10.1162/153244303768966085
- Blind separation of sources. I: An adaptive algorithm based on neuromimetic architecture
- Independent component analysis, a new concept?
- Sequential Fixed-Point ICA Based on Mutual Information Minimization
Cited in (8)
- Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search
- Dimensionality reduction for density ratio estimation in high-dimensional spaces
- Necessary and sufficient conditions of proper estimators based on self density ratio for unnormalized statistical models
- Machine learning with squared-loss mutual information
- Least-squares independent component analysis
- Canonical dependency analysis based on squared-loss mutual information
- A unifying information-theoretic framework for independent component analysis
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation