Oracle posterior contraction rates under hierarchical priors
Publication:2044331
Abstract: We offer a general Bayes-theoretic framework to derive posterior contraction rates under a hierarchical prior design: the first-step prior serves to assess the model selection uncertainty, and the second-step prior quantifies the prior belief about the strength of the signals within the model chosen in the first step. In particular, we establish non-asymptotic oracle posterior contraction rates under (i) a local Gaussianity condition on the log likelihood ratio of the statistical experiment, (ii) a local entropy condition on the dimensionality of the models, and (iii) a sufficient mass condition on the second-step prior near the best approximating signal for each model. The first-step prior can be designed generically. The posterior distribution enjoys Gaussian tail behavior, and therefore the resulting posterior mean also satisfies an oracle inequality, automatically serving as an adaptive point estimator in a frequentist sense. Model mis-specification is allowed in these oracle rates. The local Gaussianity condition provides a unified non-asymptotic Gaussian quantification of the experiments, and can be easily verified in various experiments considered in [GvdV07a] and beyond. The general results are applied to various problems, including: (i) trace regression, (ii) shape-restricted isotonic/convex regression, (iii) high-dimensional partially linear regression, (iv) covariance matrix estimation in the sparse factor model, (v) detection of non-smooth polytopal image boundaries, and (vi) intensity estimation in a Poisson point process model. These new results serve either as theoretical justification for practical prior proposals in the literature, or as an illustration of a generic construction scheme for a (nearly) minimax adaptive estimator in a complicated experiment.
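As a rough illustration of the two-step design described in the abstract (the notation below is ours, not taken from the paper): a first-step prior weights the candidate models, a second-step prior places mass on signals within each model, and the resulting oracle rate balances model dimension against approximation error.

```latex
% Illustrative notation only; symbols are not those of the paper.
% First-step prior \lambda on the model index m, second-step prior \Pi_m
% on signals within model m:
\[
  \Pi(\,\cdot\,) \;=\; \sum_{m \in \mathcal{M}} \lambda(m)\,\Pi_m(\,\cdot\,).
\]
% An oracle posterior contraction rate then trades off the dimension of
% each model against its approximation error to the true signal f_0:
\[
  \varepsilon_n^2 \;=\; \inf_{m \in \mathcal{M}}
    \Bigl\{ \tfrac{\dim(m)}{n} + d^2\bigl(f_0,\mathcal{F}_m\bigr) \Bigr\},
  \qquad
  \Pi\bigl(\, d(f, f_0) \ge M\,\varepsilon_n \,\bigm|\, X^{(n)} \bigr)
    \;\longrightarrow\; 0 \quad \text{in } P_{f_0}\text{-probability}.
\]
```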
Recommendations
- Rates of contraction of posterior distributions based on Gaussian process priors
- Oracle convergence rate of posterior under projection prior and Bayesian model selection
- Oracle-type posterior contraction rates in Bayesian inverse problems
- A general framework for Bayes structured linear models
- Adaptive posterior contraction rates for the horseshoe
Cites work
- scientific article; zbMATH DE number 51427 (title unavailable)
- scientific article; zbMATH DE number 1321826 (title unavailable)
- scientific article; zbMATH DE number 1420699 (title unavailable)
- A Bayesian approach for noisy matrix completion: optimal rate under general sampling distribution
- A Bayesian approach to non-parametric monotone function estimation
- A Bayesian nonparametric approach to log-concave density estimation
- A general framework for Bayes structured linear models
- Adaptive Bayesian density estimation with location-scale mixtures
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- Adaptive Bayesian inference on the mean of an infinite-dimensional normal distribution
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- Aggregation and minimax optimality in high-dimensional estimation
- An asymptotic property of model selection criteria
- Approximation and estimation of s-concave densities via Rényi divergences
- Asymptotic behaviour of the empirical Bayes posteriors associated to maximum marginal likelihood estimator
- Asymptotic equivalence for nonparametric regression with non-regular errors
- Asymptotic frequentist coverage properties of Bayesian credible sets for sieve priors
- Bayesian Estimation of the Spectral Density of a Time Series
- Bayesian Isotonic Regression and Trend Analysis
- Bayesian detection of image boundaries
- Bayesian linear regression with sparse priors
- Bayesian monotone regression using Gaussian process projection
- Bayesian nonparametric estimation of the spectral density of a long or intermediate memory Gaussian process
- Bayesian optimal adaptive estimation using a sieve prior
- Bayesian structure learning in graphical models
- Concentration inequalities. A nonasymptotic theory of independence
- Convergence rates for Bayesian density estimation of infinite-dimensional exponential families
- Convergence rates of posterior distributions for non iid observations
- Convergence rates of posterior distributions
- Covering Numbers for Convex Functions
- Decoding by Linear Programming
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities
- Estimation of high-dimensional low-rank matrices
- Fundamentals of nonparametric Bayesian inference
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Information-theoretic determination of minimax rates of convergence
- Minimax optimal estimation in partially linear additive models under high dimension
- Minimax optimal rates of estimation in high dimensional additive models
- Minimum complexity density estimation
- Misspecification in infinite-dimensional Bayesian statistics
- Needles and straw in a haystack: posterior concentration for possibly sparse sequences
- Nonparametric Bayesian analysis of the compound Poisson prior for support boundary recovery
- Nonparametric Bayesian model selection and averaging
- On Bayesian supremum norm contraction rates
- On adaptive posterior concentration rates
- On coverage and local radial rates of credible sets
- On risk bounds in isotonic and other shape restricted regression problems
- On universal Bayesian adaptation
- Optimal rates of convergence for convex set estimation from support functions
- Posterior concentration rates for empirical Bayes procedures with applications to Dirichlet process mixtures
- Posterior contraction in sparse Bayesian factor models for massive covariance matrices
- Posterior convergence rates for estimating large precision matrices using graphical models
- Posterior convergence rates of Dirichlet mixtures at smooth densities
- Rate exact Bayesian adaptation with modified block priors
- Rate-optimal posterior contraction for sparse PCA
- Rates for Bayesian estimation of location-scale mixtures of super-smooth densities
- Rates of contraction of posterior distributions based on Gaussian process priors
- Rates of convergence for minimum contrast estimators
- Rates of convergence for the posterior distributions of mixtures of betas and adaptive nonparametric estimation of the density
- Rates of convergence of posterior distributions
- Risk bounds for model selection via penalization
- Statistics for high-dimensional data. Methods, theory and applications
- Supremum norm posterior contraction and credible sets for nonparametric multivariate regression
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Weak convergence and empirical processes. With applications to statistics
Cited in (8)
- Oracle-type posterior contraction rates in Bayesian inverse problems
- Bayesian fractional posteriors
- Bayesian model selection and the concentration of the posterior of hyperparameters
- A general framework for Bayes structured linear models
- Oracle convergence rate of posterior under projection prior and Bayesian model selection
- Adaptive variational Bayes: optimality, computation and applications
- Lower bound for the oracle projection posterior convergence rate
- Empirical Bayes methods in high dimensions: a survey and ongoing debates