Fast rates for general unbounded loss functions: from ERM to generalized Bayes
Publication: 4969103
zbMATH Open: 1498.68238 · arXiv: 1605.00252 · MaRDI QID: Q4969103
Authors: Peter D. Grünwald, Nishant A. Mehta
Publication date: 5 October 2020
Full work available at URL: https://arxiv.org/abs/1605.00252
Recommendations
- Posterior concentration and fast convergence rates for generalized Bayesian learning
- Relative deviation learning bounds and generalization with unbounded loss functions
- Fast rates in statistical and online learning
- Risk bounds for statistical learning
- Fast learning rates in statistical inference through aggregation
MSC classifications:
- Statistical aspects of information-theoretic topics (62B10)
- Bayesian inference (62F15)
- Density estimation (62G07)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Empirical Bayes posterior concentration in sparse high-dimensional linear models
- Fast learning rates in statistical inference through aggregation
- Optimal learning with Q-aggregation
- Prediction, Learning, and Games
- On Information and Sufficiency
- Learning by mirror averaging
- Bayesian fractional posteriors
- Convergence rates of posterior distributions for non-i.i.d. observations
- Convergence rates of posterior distributions.
- Challenging the empirical mean and empirical variance: a deviation study
- Loss minimization and parameter estimation with heavy tails
- Convex Analysis
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Misspecification in infinite-dimensional Bayesian statistics
- Local Rademacher complexities
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Inconsistency of Bayesian inference for misspecified linear models, and a proposal for repairing it
- Minimum complexity density estimation
- Robust Bayesian inference via coarsening
- A General Framework for Updating Belief Distributions
- Nonparametric Bayesian model selection and averaging
- The semiparametric Bernstein-von Mises theorem
- Mutual information, metric entropy and cumulative relative entropy risk
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- The consistency of posterior distributions in nonparametric problems
- On Bayesian consistency
- Optimal aggregation of classifiers in statistical learning.
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- Learning without concentration
- DOI: 10.1162/1532443041424300
- Fast learning rates for plug-in classifiers
- Model selection for Gaussian regression with random design
- Rényi Divergence and Kullback-Leibler Divergence
- Information-theoretic determination of minimax rates of convergence
- Efficient agnostic learning of neural networks with bounded fan-in
- Empirical minimization
- An asymptotic property of model selection criteria
- Follow the leader if you can, hedge if you must
- PAC-Bayesian stochastic model selection
- From ε-entropy to KL-entropy: analysis of minimum information complexity density estimation
- A decision-theoretic extension of stochastic complexity and its applications to learning
- Rigorous learning curve bounds from statistical mechanics
- f-Divergence Inequalities
- Information Consistency of Nonparametric Gaussian Process Methods
- Relative deviation learning bounds and generalization with unbounded loss functions
- On aggregation for heavy-tailed classes
- Fast rates in statistical and online learning
- The safe Bayesian. Learning the learning rate via the mixability gap
- Regularization, sparse recovery, and median-of-means tournaments
Cited In (17)
- Generalized Bayes approach to inverse problems with model misspecification
- Anytime-Valid Tests of Conditional Independence Under Model-X
- ERM learning with unbounded sampling
- Relative deviation learning bounds and generalization with unbounded loss functions
- Empirical Bayes inference in sparse high-dimensional generalized linear models
- Posterior consistency for the spectral density of non‐Gaussian stationary time series
- Minimax rates for conditional density estimation via empirical entropy
- Relaxing the i.i.d. assumption: adaptively minimax optimal regret via root-entropic regularization
- The no-free-lunch theorems of supervised learning
- Fast rates in statistical and online learning
- A comparison of learning rate selection methods in generalized Bayesian inference
- Bayesian functional registration of fMRI activation maps
- Posterior concentration and fast convergence rates for generalized Bayesian learning
- Gibbs posterior concentration rates under sub-exponential type losses
- Bernstein-von Mises theorem and misspecified models: a review
- Gibbs posterior convergence and the thermodynamic formalism
- User-friendly Introduction to PAC-Bayes Bounds