Canonical dependency analysis based on squared-loss mutual information
DOI: 10.1016/j.neunet.2012.06.009
zbMATH: 1258.68115
OpenAlex: W2134127584
Wikidata: Q47342166 (Scholia: Q47342166)
MaRDI QID: Q1942697
Authors: Masayuki Karasuyama, Masashi Sugiyama
Publication date: 13 March 2013
Published in: Neural Networks
Full work available at URL: http://hdl.handle.net/2433/159940
Mathematics Subject Classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Learning and adaptive systems in artificial intelligence (68T05)
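As background for the title, squared-loss mutual information (SMI) is commonly defined in this line of work as the Pearson divergence between the joint density and the product of the marginals. The sketch below is standard background rather than text from this record, and the projection view in the closing comment is an assumption based only on the title.

% Squared-loss mutual information (SMI) between random variables X and Y:
% one half of the Pearson divergence of p(x, y) from p(x) p(y).
\[
  \mathrm{SMI}(X, Y)
  = \frac{1}{2} \iint \left( \frac{p(x, y)}{p(x)\, p(y)} - 1 \right)^{2} p(x)\, p(y)\, \mathrm{d}x\, \mathrm{d}y .
\]
% Canonical dependency analysis presumably seeks projection directions u and v
% that maximize an estimate of SMI between u^T x and v^T y (assumption from the title).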
Related Items
- Canonical kernel dimension reduction
- Estimation of mutual information by the fuzzy histogram
- Machine learning with squared-loss mutual information
Cites Work
- A Mathematical Theory of Communication
- Common nonstationary components of asset prices
- Nonparametric and semiparametric models.
- Canonical correlation analysis based on information theory
- SINBAD: A neocortical mechanism for discovering environmental variables and regularities hidden in sensory input
- Kernel dimension reduction in regression
- Robust canonical correlations: a comparative study
- DOI: 10.1162/153244302760185252
- DOI: 10.1162/153244303768966085
- Modern Multivariate Statistical Techniques
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Numerical Optimization
- Canonical Correlation Analysis: An Overview with Application to Learning Methods
- Algorithmic Learning Theory
- Elements of Information Theory
- Relations Between Two Sets of Variates
- Theory of Reproducing Kernels
- On Information and Sufficiency