Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
From MaRDI portal
Publication: 482875
DOI: 10.1214/14-AOS1238
zbMath: 1302.62066
arXiv: 1306.4960
OpenAlex: W3103820806
Wikidata: Q43079370
Scholia: Q43079370
MaRDI QID: Q482875
Zhaoran Wang, Tong Zhang, Han Liu
Publication date: 6 January 2015
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1306.4960
Keywords: path-following method; geometric computational rate; nonconvex regularized \(M\)-estimation; optimal statistical rate
Mathematics Subject Classification:
- Parametric inference under constraints (62F30)
- Generalized linear models (logistic models) (62J12)
- Nonconvex programming, global optimization (90C26)
- Methods of reduced gradient type (90C52)
Related Items
- Global solutions to folded concave penalized nonconvex learning
- Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
- High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
- Hard Thresholding Regularised Logistic Regression: Theory and Algorithms
- Point source super-resolution via non-convex \(L_1\) based methods
- Distributed testing and estimation under sparse high dimensional models
- Bias versus non-convexity in compressed sensing
- The Spike-and-Slab LASSO
- A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
- Nonconvex regularization for sparse neural networks
- Penalised robust estimators for sparse and high-dimensional linear models
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Penalized Estimation of Frailty-Based Illness–Death Models for Semi-Competing Risks
- Penalized wavelet nonparametric univariate logistic regression for irregular spaced data
- Accelerate the warm-up stage in the Lasso computation via a homotopic approach
- Dynamic behavior analysis via structured rank minimization
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- Covariate-Assisted Sparse Tensor Completion
- Model selection in high-dimensional quantile regression with seamless \(L_0\) penalty
- A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
- An unbiased approach to compressed sensing
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Sorted concave penalized regression
- Minimum average variance estimation with group Lasso for the multivariate response central mean subspace
- A unified primal dual active set algorithm for nonconvex sparse recovery
- Optimal sparsity testing in linear regression model
- Fully Bayesian logistic regression with hyper-LASSO priors for high-dimensional feature selection
- Computational and statistical analyses for robust non-convex sparse regularized regression problem
- Analysis of generalized Bregman surrogate algorithms for nonsmooth nonconvex statistical learning
- A theoretical understanding of self-paced learning
- Sample average approximation with sparsity-inducing penalty for high-dimensional stochastic programming
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
- Nonregular and minimax estimation of individualized thresholds in high dimension with binary responses
Uses Software
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Gradient methods for minimizing composite functions
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Iterative hard thresholding for compressed sensing
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Sparsity in penalized empirical risk minimization
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Introductory lectures on convex optimization. A basic course.
- Least angle regression. (With discussion)
- An iterative algorithm for fitting nonconvex penalized generalized linear models with grouped predictors
- Sparse permutation invariant covariance estimation
- Thresholding-based iterative selection procedures for model selection and shrinkage
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Multi-stage convex relaxation for feature selection
- Calibrating nonconvex penalized regression in ultra-high dimension
- Structure estimation for discrete graphical models: generalized covariance matrices and their inverses
- Strong oracle optimality of folded concave penalized estimation
- Variable selection using MM algorithms
- Piecewise linear regularized solution paths
- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Decoding by Linear Programming
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Reconstruction by Separable Approximation
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Smoothly Clipped Absolute Deviation on High Dimensions
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- A general theory of concave regularization for high-dimensional sparse estimation problems