Optimal computational and statistical rates of convergence for sparse nonconvex learning problems

From MaRDI portal
Publication: 482875

DOI: 10.1214/14-AOS1238
zbMath: 1302.62066
arXiv: 1306.4960
OpenAlex: W3103820806
Wikidata: Q43079370 (Scholia: Q43079370)
MaRDI QID: Q482875

Zhaoran Wang, Tong Zhang, Han Liu

Publication date: 6 January 2015

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1306.4960



Related Items

Global solutions to folded concave penalized nonconvex learning
Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
Hard Thresholding Regularised Logistic Regression: Theory and Algorithms
Point source super-resolution via non-convex \(L_1\) based methods
Distributed testing and estimation under sparse high dimensional models
Bias versus non-convexity in compressed sensing
The Spike-and-Slab LASSO
A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
Nonconvex regularization for sparse neural networks
Penalised robust estimators for sparse and high-dimensional linear models
Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
Penalized Estimation of Frailty-Based Illness–Death Models for Semi-Competing Risks
Penalized wavelet nonparametric univariate logistic regression for irregular spaced data
Accelerate the warm-up stage in the Lasso computation via a homotopic approach
Dynamic behavior analysis via structured rank minimization
Misspecified nonconvex statistical optimization for sparse phase retrieval
Covariate-Assisted Sparse Tensor Completion
Model selection in high-dimensional quantile regression with seamless \(L_0\) penalty
A primal and dual active set algorithm for truncated \(L_1\) regularized logistic regression
An unbiased approach to compressed sensing
Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
Pathwise coordinate optimization for sparse learning: algorithm and theory
I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
Sorted concave penalized regression
Minimum average variance estimation with group Lasso for the multivariate response central mean subspace
A unified primal dual active set algorithm for nonconvex sparse recovery
Optimal sparsity testing in linear regression model
Fully Bayesian logistic regression with hyper-LASSO priors for high-dimensional feature selection
Computational and statistical analyses for robust non-convex sparse regularized regression problem
Analysis of generalized Bregman surrogate algorithms for nonsmooth nonconvex statistical learning
A theoretical understanding of self-paced learning
Sample average approximation with sparsity-inducing penalty for high-dimensional stochastic programming
Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
Nonregular and minimax estimation of individualized thresholds in high dimension with binary responses
