On the finite-sample analysis of \(\Theta\)-estimators
MaRDI QID: Q5899659
DOI: 10.1214/15-EJS1100
zbMath: 1329.62328
Publication date: 21 January 2016
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1453298265
Classification (MSC):
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Nonconvex programming, global optimization (90C26)
- Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87)
Related Items (1)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Oracle inequalities and optimal inference under group sparsity
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- How close is the sample covariance matrix to the actual covariance matrix?
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- An iterative algorithm for fitting nonconvex penalized generalized linear models with grouped predictors
- Thresholding-based iterative selection procedures for model selection and shrinkage
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Ideal spatial adaptation by wavelet shrinkage
- Variable selection via nonconcave penalized likelihood and its oracle properties
- The Generic Chaining
- Regularization and variable selection via the elastic net
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- A general theory of concave regularization for high-dimensional sparse estimation problems