Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
From MaRDI portal
DOI: 10.1162/NECO_a_00407 · zbMath: 1269.62054 · Wikidata: Q50767941 · Scholia: Q50767941 · MaRDI QID: Q5327187
Masashi Sugiyama, Taiji Suzuki
Publication date: 7 August 2013
Published in: Neural Computation
Related Items (16)
- A brief review of linear sufficient dimension reduction through optimization
- Manifold Optimization-Assisted Gaussian Variational Approximation
- Canonical kernel dimension reduction
- Necessary and sufficient conditions of proper estimators based on self density ratio for unnormalized statistical models
- Semi-supervised information-maximization clustering
- Model-based policy gradients with parameter-based exploration by least-squares conditional density estimation
- Density-Difference Estimation
- Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization
- Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction
- Wasserstein discriminant analysis
- Least-squares independence regression for non-linear causal inference under non-Gaussian noise
- Supervised dimensionality reduction via distance correlation maximization
- Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
- Dimensionality reduction for density ratio estimation in high-dimensional spaces
- Model-based reinforcement learning with dimension reduction
- Information-Theoretic Representation Learning for Positive-Unlabeled Classification
Cites Work
- A constructive approach to the estimation of dimension reduction directions
- Fast rates for support vector machines using Gaussian kernels
- Rates of convergence for minimum contrast estimators
- Dual representation of \(\phi\)-divergences and applications
- Sufficient dimension reduction and graphics in regression
- Kernel dimension reduction in regression
- DOI: 10.1162/153244302760185252
- Functional Principal Component Regression and Functional Partial Least Squares
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Sliced Inverse Regression for Dimension Reduction
- Asymptotic Statistics
- The Geometry of Algorithms with Orthogonality Constraints
- Extending Sliced Inverse Regression
- SAVE: a method for dimension reduction and graphics in regression
- Estimation of the information by an adaptive partitioning of the observation space
- DOI: 10.1162/153244302760200687
- DOI: 10.1162/153244303322753742
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- Probability Inequalities for Sums of Bounded Random Variables
- Elements of Information Theory
- Edgeworth Approximation of Multivariate Differential Entropy
- Sufficient Dimension Reduction via Inverse Regression
- On Sliced Inverse Regression With High-Dimensional Covariates
- Relations Between Two Sets of Variates
- A Class of Statistics with Asymptotically Normal Distribution
- Theory of Reproducing Kernels
- On Information and Sufficiency
- A new look at the statistical model identification