Kernel dimension reduction in regression
DOI: 10.1214/08-AOS637 · zbMath: 1168.62049 · arXiv: 0908.1854 · OpenAlex: W2072460512 · MaRDI QID: Q2388983
Kenji Fukumizu, Michael I. Jordan, Francis Bach
Publication date: 22 July 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0908.1854
MSC classifications:
- Multivariate analysis (62H99)
- Characterization and structure theory for multivariate probability distributions; copulas (62H05)
- Linear inference, regression (62J99)
- General nonlinear regression (62J02)
- Applications of functional analysis in probability theory and statistics (46N30)
Related Items
- A selective overview of sparse sufficient dimension reduction
- Entropy-based sliced inverse regression
- A Rigorous Theory of Conditional Mean Embeddings
- Testing the Linear Mean and Constant Variance Conditions in Sufficient Dimension Reduction
- Sufficient dimension reduction via principal L\(q\) support vector machine
- A brief review of linear sufficient dimension reduction through optimization
- Sliced Inverse Regression in Metric Spaces
- Rejoinder on: ``Probability enhanced effective dimension reduction for classifying sparse functional data''
- Functional sufficient dimension reduction through average Fréchet derivatives
- High-dimensional sufficient dimension reduction through principal projections
- Minimal \(\sigma\)-field for flexible sufficient dimension reduction
- Nonlinear predictive directions in clinical trials
- Discriminant Dynamic Mode Decomposition for Labeled Spatiotemporal Data Collections
- Student sliced inverse regression
- Canonical kernel dimension reduction
- Probability-enhanced sufficient dimension reduction for binary classification
- Nonlinear and additive principal component analysis for functional data
- Central subspaces review: methods and applications
- Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
- Sparse SIR: optimal rates and adaptive estimation
- Predictive power of principal components for single-index model and sufficient dimension reduction
- A new reproducing kernel-based nonlinear dimension reduction method for survival data
- A structured covariance ensemble for sufficient dimension reduction
- An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space
- Supervised distance preserving projection using alternating direction method of multipliers
- Conditional Functional Graphical Models
- Dimensionality reduction by mixed kernel canonical correlation analysis
- Tail inverse regression: dimension reduction for prediction of extremes
- An Additive Graphical Model for Discrete Data
- Canonical dependency analysis based on squared-loss mutual information
- Multivariate tests of independence based on a new class of measures of independence in reproducing kernel Hilbert space
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Learning sparse gradients for variable selection and dimension reduction
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- Sufficient dimension reduction based on an ensemble of minimum average variance estimators
- A study on imbalance support vector machine algorithms for sufficient dimension reduction
- Dimension reduction and its application to model-based exploration in continuous spaces
- Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization
- Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction
- An adaptive composite quantile approach to dimension reduction
- Sufficient dimension reduction with simultaneous estimation of effective dimensions for time-to-event data
- Information-Theoretic Semi-Supervised Metric Learning via Entropy Regularization
- Minimax adaptive dimension reduction for regression
- Dealing with big data: comparing dimension reduction and shrinkage regression methods
- Least-squares independence regression for non-linear causal inference under non-Gaussian noise
- Sufficient dimension reduction and prediction in regression
- Dimensionality reduction: an interpretation from manifold regularization perspective
- Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
- Statistical performance of optimal scoring in reproducing kernel Hilbert spaces
- Slice inverse regression with score functions
- Characteristic and Universal Tensor Product Kernels
- A class of optimal estimators for the covariance operator in reproducing kernel Hilbert spaces
- Sparse Estimation of Conditional Graphical Models With Application to Gene Networks
- Kernel sliced inverse regression: regularization and consistency
- A Cost Based Reweighted Scheme of Principal Support Vector Machine
- A Conditional Entropy Minimization Criterion for Dimensionality Reduction and Multiple Kernel Learning
- Subspace perspective on canonical correlation analysis: dimension reduction and minimax rates
- Estimating sufficient reductions of the predictors in abundant high-dimensional regressions
- Model-based reinforcement learning with dimension reduction
- On the predictive potential of kernel principal components
- Computing functions of random variables via reproducing kernel Hilbert space representations
- Comments on: ``Probability enhanced effective dimension reduction for classifying sparse functional data''
- Least-Squares Independent Component Analysis
- A Shrinkage Estimation of Central Subspace in Sufficient Dimension Reduction
- Comment
- Machine learning with squared-loss mutual information
- Isometric sliced inverse regression for nonlinear manifold learning
- Gradient-Based Kernel Dimension Reduction for Regression
- On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis
- Discussion of: Brownian distance covariance
- Admissible kernels for RKHS embedding of probability distributions
- Sliced inverse regression method for multivariate compositional data modeling
- A Nonparametric Graphical Model for Functional Data With Application to Brain Networks Based on fMRI
- A Generalized Kernel Method for Global Sensitivity Analysis
Cites Work
- Structure adaptive approach for dimension reduction
- Dimension reduction for conditional mean in regression
- Sufficient dimension reduction and graphics in regression
- Moment-based dimension reduction for multivariate response regression
- Contour regression: a general approach to dimension reduction
- Hilbert space embeddings and metrics on probability measures
- DOI: 10.1162/153244303768966085
- Exploring Regression Structure Using Nonparametric Functional Estimation
- Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Sliced Inverse Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- Joint Measures and Cross-Covariance Operators
- Asymptotic Statistics
- Theory & Methods: Special Invited Paper: Dimension Reduction and Visualization in Discriminant Analysis (with discussion)
- Dimension Reduction in Binary Response Regression
- An Adaptive Estimation of Dimension Reduction Space
- Algorithmic Learning Theory
- Direction estimation in single-index regressions
- Theory of Reproducing Kernels
- Convergence of stochastic processes