Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
Publication: Q830557
DOI: 10.1016/J.CSDA.2020.107047
OpenAlex: W3041493066
MaRDI QID: Q830557
Jen-Chih Yao, Xin Li, J. H. Wang, C. Li, Dongya Wu
Publication date: 7 May 2021
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://arxiv.org/abs/1911.08061
Recommendations
- Support recovery without incoherence: a case for nonconvex regularization
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
Keywords: proximal gradient method; sparse recovery; convergence rate; statistical consistency; recovery bound; nonconvex regularized \(M\)-estimators
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Measurement error in Lasso: impact and likelihood bias correction
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- One-step sparse estimates in nonconcave penalized likelihood models
- Simultaneous analysis of Lasso and Dantzig selector
- Regularization and Variable Selection Via the Elastic Net
- Measurement Error in Nonlinear Models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Variable selection using MM algorithms
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over \(\ell_q\)-Balls
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Stable signal recovery from incomplete and inaccurate measurements
- Combining different procedures for adaptive regression
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Adaptive Minimax Estimation over Sparse \(\ell_q\)-Hulls
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Optimal adaptive estimation of linear functionals under sparsity
- Bayesian neural networks with confidence estimations applied to data mining
- An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation
Cited In (7)
- Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
- SPOQ \(\ell_p\)-Over-\(\ell_q\) Regularization for Sparse Signal Recovery Applied to Mass Spectrometry
- The slow, steady ascent of a hot solid sphere in a Newtonian fluid with strongly temperature-dependent viscosity
- Sparse recovery in probability via \(l_q\)-minimization with Weibull random matrices for \(0 < q\leq 1\)
- Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
- Adaptive Huber trace regression with low-rank matrix parameter via nonconvex regularization
- Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method