Canonical dependency analysis based on squared-loss mutual information
From MaRDI portal
Recommendations
- Estimating Squared-Loss Mutual Information for Independent Component Analysis
- Canonical correlation analysis based on information theory
- Machine learning with squared-loss mutual information
- Probing High-Order Dependencies With Information Theory
- Sufficient dimension reduction via squared-loss mutual information estimation
- Canonical ensembles for potentially incompatible dependency networks with applications to medical data
- Efficient estimation of the canonical dependence function
- Minimax Mutual Information Approach for Independent Component Analysis
- Complexity-Regularized Tree-Structured Partition for Mutual Information Estimation
Cites work
- scientific article; zbMATH DE number 472970 (no title available)
- scientific article; zbMATH DE number 1164152 (no title available)
- scientific article; zbMATH DE number 4001209 (no title available)
- scientific article; zbMATH DE number 1843060 (no title available)
- scientific article; zbMATH DE number 1849138 (no title available)
- scientific article; zbMATH DE number 837911 (no title available)
- scientific article; zbMATH DE number 3241743 (no title available)
- scientific article; zbMATH DE number 3252891 (no title available)
- scientific article; zbMATH DE number 3257925 (no title available)
- scientific article; zbMATH DE number 3322635 (no title available)
- scientific article; zbMATH DE number 961607 (no title available)
- scientific article; zbMATH DE number 3028933 (no title available)
- 10.1162/153244303768966085
- A Mathematical Theory of Communication
- A least-squares approach to direct importance estimation
- Algorithmic Learning Theory
- Canonical Correlation Analysis: An Overview with Application to Learning Methods
- Canonical correlation analysis based on information theory
- Common nonstationary components of asset prices
- Elements of Information Theory
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Kernel dimension reduction in regression
- Modern Multivariate Statistical Techniques
- Nonparametric and semiparametric models
- Numerical Optimization
- On Information and Sufficiency
- On the influence of the kernel on the consistency of support vector machines
- Relations Between Two Sets of Variates
- Robust canonical correlations: a comparative study
- SINBAD: A neocortical mechanism for discovering environmental variables and regularities hidden in sensory input
- Theory of Reproducing Kernels
Cited in (6)
- Large correlation analysis
- Non-linear canonical correlation analysis using alpha-beta divergence
- Machine learning with squared-loss mutual information
- Canonical kernel dimension reduction
- Estimation of mutual information by the fuzzy histogram
- Canonical dependency analysis using a bias-corrected \(\chi^2\) statistics matrix
MaRDI item Q1942697