L1-norm-based principal component analysis with adaptive regularization
DOI: 10.1016/j.patcog.2016.07.014 · zbMath: 1414.62216 · OpenAlex: W2464293031 · MaRDI QID: Q2417836
Yong Wang, Zhongqun Wang, Gui-Fu Lu, Jian Zou
Publication date: 29 May 2019
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/j.patcog.2016.07.014
Mathematics Subject Classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Pattern recognition, speech recognition (68T10)
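For orientation only: this record carries no algorithmic detail, so the sketch below illustrates the generic L1-norm PCA objective, maximizing \(\sum_i |w^\top x_i|\) with the well-known sign-flipping iteration (Kwak's PCA-L1), rather than the adaptive-regularization method of the paper itself. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def l1_pca_component(X, n_iter=100, seed=0):
    """One L1-norm principal direction: maximize sum_i |w^T x_i| with ||w|| = 1
    via the greedy sign-flipping iteration. X is (n_samples, n_features),
    assumed already centered."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        s = np.sign(X @ w)            # polarity of each sample's projection
        s[s == 0] = 1.0               # avoid stalling on zero projections
        w_new = X.T @ s               # sign-weighted sum of samples
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):     # converged: direction no longer changes
            break
        w = w_new
    return w

def l1_pca(X, k=2):
    """Extract k L1-norm components by deflation (illustrative helper)."""
    Xc = X - X.mean(axis=0)
    W = []
    for _ in range(k):
        w = l1_pca_component(Xc)
        W.append(w)
        Xc = Xc - np.outer(Xc @ w, w)   # remove the found direction
    return np.column_stack(W)
```

The sign-flipping update is used here because it is the standard baseline for the L1-norm PCA objective named in the title; the paper's adaptive regularization is not reproduced, as the record above does not describe it.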
Related Items (3)
- Autoencoders reloaded
- Principal component analysis based on nuclear norm minimization
- F-norm distance metric based robust 2DPCA and face recognition
Cites Work
- 2DPCA with L1-norm for simultaneously robust and sparse modelling
- Improve robustness of sparse PCA by \(L_{1}\)-norm maximization
- An augmented Lagrangian approach for sparse principal component analysis
- Simultaneous analysis of Lasso and Dantzig selector
- A Singular Value Thresholding Algorithm for Matrix Completion
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)