Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction
From MaRDI portal
Publication:5380831
DOI: 10.1162/neco_a_00986
zbMath: 1461.62081
arXiv: 1508.01019
OpenAlex: W2963815578
Wikidata: Q38733935
Scholia: Q38733935
MaRDI QID: Q5380831
Hiroaki Sasaki, Masashi Sugiyama, Voot Tangkaratt
Publication date: 6 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1508.01019
Mathematics Subject Classification:
- Measures of association (correlation, canonical correlation, etc.) (62H20)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Wasserstein discriminant analysis
- Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
Uses Software
Cites Work
- A constructive approach to the estimation of dimension reduction directions
- Derivative reproducing properties for kernel methods in learning theory
- Principal component analysis.
- Support-vector networks
- Kernel dimension reduction in regression
- Minimization of functions having Lipschitz continuous first partial derivatives
- Manopt, a Matlab toolbox for optimization on manifolds
- Dimension Reduction: A Guided Tour
- Estimation of conditional densities and sensitivity measures in nonlinear dynamical systems
- Sliced Inverse Regression for Dimension Reduction
- Robust and efficient estimation by minimising a density power divergence
- DOI: 10.1162/153244303322753742
- Gradient-Based Kernel Dimension Reduction for Regression
- Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
- Density-Difference Estimation
- Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization
- Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation
- Theory of Reproducing Kernels
- On Information and Sufficiency
- Learning from examples with information theoretic criteria