On the optimality of the aggregate with exponential weights for low temperatures
From MaRDI portal
Publication:1952438
DOI: 10.3150/11-BEJ408 · zbMath: 1456.62136 · arXiv: 1303.5180 · MaRDI QID: Q1952438
Guillaume Lecué, Shahar Mendelson
Publication date: 30 May 2013
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1303.5180
Keywords: aggregation; empirical process; random design; Gaussian approximation; Gibbs estimators; aggregate with exponential weights (AEW)
Mathematics Subject Classification: Central limit and other weak theorems (60F05) · General nonlinear regression (62J02) · Interacting random processes; statistical mechanics type models; percolation theory (60K35) · Approximations to statistical distributions (nonasymptotic) (62E17) · Response surface designs (62K20)
Related Items
- User-friendly Introduction to PAC-Bayes Bounds
- Sharp oracle inequalities for aggregation of affine estimators
- Simple proof of the risk bound for denoising by exponential weights for asymmetric noise distributions
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Mirror averaging with sparsity priors
- General nonexact oracle inequalities for classes with a subexponential envelope
- Statistical inference in compound functional models
- Deviation optimal learning using greedy \(Q\)-aggregation
- On the exponentially weighted aggregate with the Laplace prior
Cites Work
- Sharper lower bounds on the performance of the empirical risk minimization algorithm
- Some limit theorems for empirical processes (with discussion)
- Aggregation via empirical risk minimization
- Obtaining fast error rates in nonconvex situations
- Learning by mirror averaging
- Lectures on probability theory and statistics. École d'Été de Probabilités de Saint-Flour XXVIII - 1998. Summer school, Saint-Flour, France, August 17 -- September 3, 1998
- Combining different procedures for adaptive regression
- Mixing strategies for density estimation.
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Optimal aggregation of classifiers in statistical learning.
- Weak convergence and empirical processes. With applications to statistics
- On the optimality of the empirical risk minimization procedure for the convex aggregation problem
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Fast learning rates in statistical inference through aggregation
- Optimal rates and adaptation in the single-index model using aggregation
- Aggregation for Gaussian regression
- Simultaneous adaptation to the margin and to complexity in classification
- Empirical minimization
- Information Theory and Mixing Least-Squares Regressions
- Lower Bounds for the Empirical Minimization Algorithm
- Adaptive Regression by Mixing
- Learning Theory and Kernel Machines
- Aggregation by Exponential Weighting and Sharp Oracle Inequalities
- Introduction to nonparametric estimation