Sparse and robust estimation with ridge minimax concave penalty
From MaRDI portal
Publication:6092060
DOI: 10.1016/j.ins.2021.04.047
OpenAlex: W3153947822
MaRDI QID: Q6092060
Yao Dong, He Jiang, Weihua Zheng
Publication date: 23 November 2023
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/j.ins.2021.04.047
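For orientation, the minimax concave penalty (MCP) named in the title is the penalty of Zhang's "Nearly unbiased variable selection under minimax concave penalty" (cited below). The following is a minimal illustrative sketch of that standard MCP function only, not the paper's ridge-MCP estimator; the parameter names `lam` and `gamma` follow the usual notation \(\lambda\) (penalty level) and \(\gamma\) (concavity).

```python
import numpy as np

def mcp_penalty(t, lam=1.0, gamma=3.0):
    """Elementwise minimax concave penalty (MCP):
    p(t) = lam*|t| - t^2/(2*gamma)  for |t| <= gamma*lam,
    p(t) = gamma*lam^2 / 2          otherwise.
    """
    t = np.abs(np.asarray(t, dtype=float))
    concave = lam * t - t**2 / (2.0 * gamma)  # lasso-like near zero, flattening out
    ceiling = 0.5 * gamma * lam**2            # constant beyond the threshold
    return np.where(t <= gamma * lam, concave, ceiling)

# The penalty grows like the lasso near zero but saturates,
# leaving large coefficients nearly unpenalized (reduced bias).
print(mcp_penalty([0.0, 1.0, 5.0], lam=1.0, gamma=3.0))
```

Because the penalty is constant for \(|t| > \gamma\lambda\), large signals incur no additional shrinkage, which is the source of MCP's near-unbiasedness relative to the lasso.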
Cites Work
- Unnamed Item
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
- Robust and sparse bridge regression
- Robust penalized quantile regression estimation for panel data
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Maximum correntropy adaptation approach for robust compressive sensing reconstruction
- Generalized \(\ell_1\)-penalized quantile regression with linear constraints
- Simultaneous feature selection and clustering based on square root optimization
- Correntropy-based metric for robust twin support vector machine
- Regularized quantile regression for ultrahigh-dimensional data with nonignorable missing responses
- Semi-supervised sparse feature selection via graph Laplacian based scatter matrix for regression problems
- Robust stochastic configuration networks with maximum correntropy criterion for uncertain data regression
- Robust image compressive sensing based on half-quadratic function and weighted Schatten-\(p\) norm
- Simultaneous analysis of Lasso and Dantzig selector
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Adaptive robust variable selection
- The adaptive \(\ell_1\)-penalized LAD regression for partially linear single-index models
- A Frisch-Newton algorithm for sparse quantile regression
- Least Median of Squares Regression
- SCAD-Penalized Least Absolute Deviation Regression in High-Dimensional Models
- A Robust Method for Multiple Linear Regression
- Regression Quantiles
- Alternatives to the Median Absolute Deviation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Regularization via Convex Analysis
- A Statistical View of Some Chemometrics Regression Tools
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Regularization and Variable Selection Via the Elastic Net
- Asymptotic properties for combined \(\ell_1\) and concave regularization
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- A general theory of concave regularization for high-dimensional sparse estimation problems