Supervised dimensionality reduction via distance correlation maximization
From MaRDI portal
Abstract: In our work, we propose a novel formulation for supervised dimensionality reduction based on a nonlinear dependency criterion, statistical distance correlation (Székely et al., 2007). Our objective is free of distributional assumptions on the regression variables and of regression model assumptions. The proposed formulation learns a low-dimensional feature representation that maximizes the squared sum of distance correlations between the low-dimensional features and the response, and between the features and the covariates. We propose a novel algorithm to optimize this objective using the Generalized Minimization Maximization method of Parizi et al. (2015). Empirical results on multiple datasets demonstrate the effectiveness of the proposed approach over several relevant state-of-the-art supervised dimensionality reduction methods.
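The dependency criterion named in the abstract, the sample distance correlation of Székely et al. (2007), can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names and the direct pairwise-distance computation are illustrative choices.

```python
import numpy as np

def double_center(d):
    """U-center a pairwise distance matrix: subtract row and column
    means, add back the grand mean."""
    return (d - d.mean(axis=0, keepdims=True)
              - d.mean(axis=1, keepdims=True)
              + d.mean())

def distance_correlation(x, y):
    """Sample distance correlation between samples x and y
    (each of shape (n,) or (n, d))."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    # Double-centered Euclidean distance matrices
    a = double_center(np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1))
    b = double_center(np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1))
    dcov2 = (a * b).mean()          # squared distance covariance
    dvar_x = (a * a).mean()         # squared distance variances
    dvar_y = (b * b).mean()
    if dvar_x * dvar_y == 0.0:
        return 0.0                  # a constant sample has zero distance variance
    return np.sqrt(max(dcov2, 0.0) / np.sqrt(dvar_x * dvar_y))
```

Distance correlation equals 1 for affinely related univariate samples and, unlike Pearson correlation, is nonzero for general nonlinear dependence, which is what makes it usable as the supervision criterion described above.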
scientific article; zbMATH DE number 5957198
Recommendations
- Sufficient dimension reduction via squared-loss mutual information estimation
- Dependence maps, a dimensionality reduction with dependence distance for high-dimensional data
- Supervised principal component analysis: visualization, classification and regression on subspaces and submanifolds
- Large correlation analysis
Cites work
- scientific article; zbMATH DE number 5957257
- scientific article; zbMATH DE number 964896
- Brownian distance covariance
- Constructive setting for problems of density ratio estimation
- Density ratio estimation in machine learning. Foreword by Thomas G. Dietterich
- Diagnostic studies in sufficient dimension reduction
- Feature screening via distance correlation learning
- Gradient-based kernel dimension reduction for regression
- Graphics for Regressions With a Binary Response
- Hedonic housing prices and the demand for clean air
- Likelihood-based sufficient dimension reduction
- Marginal tests with sliced average variance estimation
- Measuring and testing dependence by correlation of distances
- Minimization of ratios
- Modern multidimensional scaling. Theory and applications.
- Multidimensional scaling. I: Theory and method
- On Nonlinear Fractional Programming
- On convergence of minimization methods: Attraction, repulsion, and selection
- On the uniqueness of distance covariance
- Optimization
- Partial central subspace and sliced average variance estimation
- Sequential Minimax Search for a Maximum
- Sliced Inverse Regression for Dimension Reduction
- Sliced inverse regression for multivariate response regression
- Sufficient dimension reduction via squared-loss mutual information estimation
- The distance correlation \(t\)-test of independence in high dimension
- The sliced inverse regression algorithm as a maximum likelihood procedure
- Variable selection in functional data classification: a maxima-hunting proposal
Cited in (15)
- Convergence rates for kernel regression in infinite-dimensional spaces
- Dependence maps, a dimensionality reduction with dependence distance for high-dimensional data
- A dimensionality reduction method of continuous dependent variables based supervised Laplacian eigenmaps
- Asymptotic distributions of high-dimensional distance correlation inference
- Large correlation analysis
- Feature selection based on distance correlation: a filter algorithm
- scientific article; zbMATH DE number 7376763
- Supervised dimension reduction of intrinsically low-dimensional data
- Diverse data selection via combinatorial quasi-concavity of distance covariance: a polynomial time global minimax algorithm
- Supervised distance preserving projection using alternating direction method of multipliers
- Dimensionality reduction using automatic supervision for vision-based terrain learning
- Direct estimation of the derivative of quadratic mutual information with application in supervised dimension reduction
- Unsupervised dimensionality reduction versus supervised regularization for classification from sparse data
- Supervised t-Distributed Stochastic Neighbor Embedding for Data Visualization and Classification
- Correlation-based multidimensional scaling for unsupervised subspace learning
This page was built for publication: Supervised dimensionality reduction via distance correlation maximization
MaRDI item Q1746548