Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization
Publication: 5380192
DOI: 10.1162/NECO_a_00683
zbMath: 1473.62126
arXiv: 1404.6876
OpenAlex: W2104320806
Wikidata: Q46429423
Scholia: Q46429423
MaRDI QID: Q5380192
Masashi Sugiyama, Voot Tangkaratt, Ning Xie
Publication date: 4 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1404.6876
Classification (MSC):
- Density estimation (62G07)
- Computer graphics; computational geometry (digital and algorithmic aspects) (68U05)
- Artificial intelligence for robotics (68T40)
- Mathematics and visual arts (00A66)
Related Items
- Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction
- Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
- Cross-Domain Metric and Multiple Kernel Learning Based on Information Theory
- Model-based reinforcement learning with dimension reduction
Uses Software
Cites Work
- A constructive approach to the estimation of dimension reduction directions
- Sliced Regression for Dimension Reduction
- Dual representation of \(\phi\)-divergences and applications
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Kernel dimension reduction in regression
- Sufficient Dimension Reduction via Bayesian Mixture Modeling
- Sliced Inverse Regression for Dimension Reduction
- Robust and efficient estimation by minimising a density power divergence
- The Geometry of Algorithms with Orthogonality Constraints
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
- Sufficient Dimension Reduction via Inverse Regression
- On Information and Sufficiency
- Robust Statistics