Manifold elastic net: a unified framework for sparse dimension reduction
From MaRDI portal
Abstract: It is difficult to find the optimal sparse solution of a manifold-learning-based dimensionality reduction algorithm. A lasso- or elastic-net-penalized manifold-learning-based dimensionality reduction is not directly a lasso-penalized least squares problem, so least angle regression (LARS) (Efron et al.), one of the most popular algorithms in sparse learning, cannot be applied directly. Most current approaches therefore take indirect routes or impose strict settings, which can be inconvenient in applications. In this paper, we propose the manifold elastic net, or MEN for short. MEN combines the merits of manifold-learning-based and sparse-learning-based dimensionality reduction. Through a series of equivalent transformations, we show that MEN is equivalent to a lasso-penalized least squares problem, so LARS can be adopted to obtain the optimal sparse solution of MEN. In particular, MEN has the following advantages for subsequent classification: 1) the local geometry of the samples is well preserved in the low-dimensional data representation; 2) both margin maximization and classification-error minimization are considered when computing the sparse projection; 3) the projection matrix of MEN improves computational parsimony; 4) the elastic net penalty reduces the over-fitting problem; and 5) the projection matrix of MEN can be interpreted psychologically and physiologically. Experimental evidence on face recognition over several popular datasets suggests that MEN is superior to state-of-the-art dimensionality reduction algorithms.
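The reduction the abstract relies on can be illustrated with the standard augmented-data identity from Zou and Hastie's elastic net paper (the MEN-specific transformations are more involved, but rest on the same idea): stacking \(\sqrt{\lambda_2}\,I\) under the design matrix and zeros under the response turns the \(\ell_2\) penalty into extra squared-error terms, leaving a pure lasso problem that LARS can solve. A minimal numerical sketch, with synthetic data and illustrative penalty values:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 5
X = rng.standard_normal((n, p))       # design matrix
y = rng.standard_normal(n)            # response
beta = rng.standard_normal(p)         # an arbitrary coefficient vector
lam1, lam2 = 0.3, 0.7                 # l1 and l2 penalty weights (illustrative)

# Elastic net objective: ||y - X b||^2 + lam1 ||b||_1 + lam2 ||b||^2
enet = (np.sum((y - X @ beta) ** 2)
        + lam1 * np.sum(np.abs(beta))
        + lam2 * np.sum(beta ** 2))

# Augmented problem: stack sqrt(lam2)*I under X and zeros under y,
# so the l2 penalty is absorbed into the least squares term.
X_aug = np.vstack([X, np.sqrt(lam2) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])

# Lasso objective on the augmented data: ||y* - X* b||^2 + lam1 ||b||_1
lasso = np.sum((y_aug - X_aug @ beta) ** 2) + lam1 * np.sum(np.abs(beta))

print(np.isclose(enet, lasso))        # the two objectives agree for every beta
```

Because the two objectives agree for every coefficient vector, their minimizers coincide, which is why a lasso solver such as LARS suffices once the transformation is done.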
Recommendations
- A metric multidimensional scaling-based nonlinear manifold learning approach for unsupervised data reduction
- Regression analysis of locality preserving projections via sparse penalty
- Dimensionality reduction with extreme learning machine based on sparsity and neighborhood preserving
- A supervised non-linear dimensionality reduction approach for manifold learning
- Dimensionality reduction: an interpretation from manifold regularization perspective
Cites work
- scientific article; zbMATH DE number 845714 (no title available)
- A Direct Formulation for Sparse PCA Using Semidefinite Programming
- A unified approach to model selection and sparse recovery using regularized least squares
- DASSO: Connections Between the Dantzig Selector and Lasso
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Least angle regression. (With discussion)
- Manifold regularization: a geometric framework for learning from labeled and unlabeled examples
- On the adaptive elastic net with a diverging number of parameters
- Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment
- Regularization and Variable Selection Via the Elastic Net
- The Adaptive Lasso and Its Oracle Properties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (13)
- Similarity preserving low-rank representation for enhanced data representation and effective subspace learning
- Finding the best not the most: regularized loss minimization subgraph selection for graph classification
- A probabilistic model for image representation via multiple patterns
- Information retrieval approach to meta-visualization
- Principal manifold learning by sparse grids
- Dimensionality reduction with extreme learning machine based on sparsity and neighborhood preserving
- Dimensionality reduction by mixed kernel canonical correlation analysis
- Dimensionality reduction: an interpretation from manifold regularization perspective
- Double linear regressions for single labeled image per person face recognition
- Primal explicit max margin feature selection for nonlinear support vector machines
- Compressed labeling on distilled labelsets for multi-label learning
- 2DPCA with L1-norm for simultaneously robust and sparse modelling
- A Lagrange-Newton algorithm for sparse nonlinear programming
This page was built for publication: Manifold elastic net: a unified framework for sparse dimension reduction
MaRDI item Q408616