Regularization and the small-ball method II: complexity dependent error rates
zbMath: 1444.62051
arXiv: 1608.07681
MaRDI QID: Q4637079
Guillaume Lecué, Shahar Mendelson
Publication date: 17 April 2018
Full work available at URL: https://arxiv.org/abs/1608.07681
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Learning and adaptive systems in artificial intelligence (68T05)
- Interacting random processes; statistical mechanics type models; percolation theory (60K35)
Related Items
- Simultaneous Phase Retrieval and Blind Deconvolution via Convex Programming
- Generic error bounds for the generalized Lasso with sub-exponential data
- Robust machine learning by median-of-means: theory and practice
- Regularization and the small-ball method. I: Sparse recovery
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- Solving equations of random convex functions via anchored regression
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
Cites Work
- Performance of empirical risk minimization in linear aggregation
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Matrix completion via max-norm constrained optimization
- Upper bounds on product and multiplier empirical processes
- Sparse recovery under weak moment assumptions
- Statistics for high-dimensional data. Methods, theory and applications.
- Exponential screening and optimal rates of sparse estimation
- Estimation of high-dimensional low-rank matrices
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Some limit theorems for empirical processes (with discussion)
- SLOPE-adaptive variable selection via convex optimization
- Minimax risk over \(\ell_p\)-balls for \(\ell_q\)-error
- Regularization and the small-ball method. I: Sparse recovery
- Weak convergence and empirical processes. With applications to statistics
- The convex geometry of linear inverse problems
- \(\ell_1\)-regularized linear regression: persistence and oracle inequalities
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- The Lasso as an \(\ell_1\)-ball model selection procedure
- Simultaneous analysis of Lasso and Dantzig selector
- Noisy low-rank matrix completion with general sampling distribution
- Gaussian averages of interpolated bodies and applications to approximate reconstruction
- Adapting to unknown sparsity by controlling the false discovery rate
- Exact matrix completion via convex optimization
- Learning without Concentration
- Adaptive Minimax Estimation over Sparse \(\ell_q\)-Hulls
- Small Ball Probabilities for Linear Images of High-Dimensional Distributions
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- The Supremum of Some Canonical Processes
- The Generic Chaining
- On the Gap Between Restricted Isometry Properties and Sparse Recovery Conditions
- A Remark on the Diameter of Random Sections of Convex Bodies
- Sharp Oracle Inequalities for High-Dimensional Matrix Prediction
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over \(\ell_q\)-Balls
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Learning Theory