Regularization and the small-ball method. I: Sparse recovery
From MaRDI portal
Publication: 1750281
DOI: 10.1214/17-AOS1562 · zbMath: 1403.60085 · arXiv: 1601.05584 · OpenAlex: W2963991711 · MaRDI QID: Q1750281
Guillaume Lecué, Shahar Mendelson
Publication date: 18 May 2018
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1601.05584
Nonparametric regression and quantile regression (62G08)
Interacting random processes; statistical mechanics type models; percolation theory (60K35)
Related Items
- On cross-validated Lasso in high dimensions
- On the prediction loss of the Lasso in the partially labeled setting
- Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
- Simultaneous Phase Retrieval and Blind Deconvolution via Convex Programming
- Generic error bounds for the generalized Lasso with sub-exponential data
- Regularization, sparse recovery, and median-of-means tournaments
- Debiasing convex regularized estimators and interval estimation in linear models
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Robust machine learning by median-of-means: theory and practice
- ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
- Improved bounds for square-root Lasso and square-root slope
- Unnamed Item
- Regularization and the small-ball method. I: Sparse recovery
- On the subdifferential of symmetric convex functions of the spectrum for symmetric and orthogonally decomposable tensors
- A MOM-based ensemble method for robustness, subsampling and hyperparameter tuning
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Sparse recovery from extreme eigenvalues deviation inequalities
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Solving equations of random convex functions via anchored regression
- Augmented minimax linear estimation
- Regularization and the small-ball method II: complexity dependent error rates
- On Monte-Carlo methods in convex stochastic optimization
- Unnamed Item
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
Cites Work
- Unnamed Item
- Unnamed Item
- On asymptotically optimal confidence regions and tests for high-dimensional models
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Upper bounds on product and multiplier empirical processes
- Sparse recovery under weak moment assumptions
- Statistics for high-dimensional data. Methods, theory and applications.
- Estimation of high-dimensional low-rank matrices
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- SLOPE-adaptive variable selection via convex optimization
- Lasso-type recovery of sparse representations for high-dimensional data
- Characterization of the subdifferential of some matrix norms
- Learning without concentration for general loss functions
- Regularization and the small-ball method. I: Sparse recovery
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Confidence sets in sparse regression
- High-dimensional graphs and variable selection with the Lasso
- Learning without Concentration
- Small Ball Probabilities for Linear Images of High-Dimensional Distributions
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Regularization and the small-ball method II: complexity dependent error rates
- A Remark on the Diameter of Random Sections of Convex Bodies
- Asymptotic Geometric Analysis, Part I
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Upper and Lower Bounds for Stochastic Processes
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Weakly decomposable regularization penalties and structured sparsity
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers