Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
DOI: 10.1214/18-AOS1742
zbMATH: 1466.62289
arXiv: 1702.01402
OpenAlex: W2598989315
Wikidata: Q127817625 (Scholia: Q127817625)
MaRDI QID: Q2313281
Vincent Cottet, Pierre Alquier, Guillaume Lecué
Publication date: 18 July 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1702.01402
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Estimation in multivariate analysis (62H12)
- Asymptotic properties of nonparametric inference (62G20)
- Nonparametric estimation (62G05)
- Minimax procedures in statistical decision theory (62C20)
Cites Work
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- General nonexact oracle inequalities for classes with a subexponential envelope
- Estimation of high-dimensional low-rank matrices
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII -- 2008
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- SLOPE-adaptive variable selection via convex optimization
- Fast learning rates for plug-in classifiers
- Efficient methods for estimating constrained parameters with applications to regularized (Lasso) logistic regression
- Smooth discrimination analysis
- 1-bit matrix completion: PAC-Bayesian analysis of a variational approximation
- Regularization and the small-ball method. I: Sparse recovery
- A probabilistic approach to the geometry of the \(\ell^n_p\)-ball
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- The convex geometry of linear inverse problems
- Slope meets Lasso: improved oracle bounds and optimality
- Lasso logistic regression, GSoft and the cyclic coordinate descent algorithm: application to gene expression data
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- A Bayesian approach for noisy matrix completion: optimal rate under general sampling distribution
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- High-dimensional generalized linear models and the lasso
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Noisy low-rank matrix completion with general sampling distribution
- Gaussian averages of interpolated bodies and applications to approximate reconstruction
- Empirical minimization
- Local Rademacher complexities
- Bayesian Methods for Low-Rank Matrix Estimation: Short Survey and Theoretical Study
- A Max-Norm Constrained Minimization Approach to 1-Bit Matrix Completion
- On the properties of variational approximations of Gibbs posteriors
- The Group Lasso for Logistic Regression
- Regularization and the small-ball method. II: Complexity dependent error rates
- Real Analysis and Probability
- DOI: 10.1162/1532443041424337