Adaptive Bayesian density estimation in \(L^p\)-metrics with Pitman-Yor or normalized inverse-Gaussian process kernel mixtures
Publication: 899034
DOI: 10.1214/14-BA863
zbMath: 1327.62161
arXiv: 1210.8094
MaRDI QID: Q899034
Publication date: 21 December 2015
Published in: Bayesian Analysis
Full work available at URL: https://arxiv.org/abs/1210.8094
Keywords: adaptation; posterior contraction rate; nonparametric density estimation; Pitman-Yor process; sinc kernel; normalized inverse-Gaussian process
Related Items
- Novel and simple non-parametric methods of estimating the joint and marginal densities
- Supremum norm posterior contraction and credible sets for nonparametric multivariate regression
- Adaptive Bayesian density estimation in sup-norm
- Bayes and maximum likelihood for \(L^1\)-Wasserstein deconvolution of Laplace mixtures
- On adaptive posterior concentration rates
- Bayesian adaptation
- On posterior contraction of parameters and interpretability in Bayesian mixture modeling
- A simple proof of Pitman-Yor's Chinese restaurant process from its stick-breaking representation
- Uncertainty quantification for Bayesian CART
- Comment: "Bayes, oracle Bayes and empirical Bayes"
Cites Work
- Minimax rates of convergence for Wasserstein deconvolution with supersmooth errors in any dimension
- Rates of contraction for posterior distributions in \(L^{r}\)-metrics, \(1 \leq r \leq \infty\)
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- On a class of Bayesian nonparametric estimates: I. Density estimates
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- On density estimation in the view of Kolmogorov's ideas in approximation theory
- Posterior convergence rates of Dirichlet mixtures at smooth densities
- Convergence rates of posterior distributions for non-i.i.d. observations
- The tails of probabilities chosen from a Dirichlet prior
- A note on the usefulness of superkernels in density estimation
- Mean integrated square error properties of density estimates
- Exact asymptotic minimax constants for the estimation of analytical functions in \(L_p\)
- The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator
- Posterior consistency of Dirichlet mixtures in density estimation
- Asymptotically efficient estimation of analytic functions in Gaussian noise
- Density estimation by wavelet thresholding
- Estimation of distribution density
- Convergence rates of posterior distributions
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities
- Convergence rates for density estimation with Bernstein polynomials
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- Adaptive Bayesian density estimation with location-scale mixtures
- Posterior rates of convergence for Dirichlet mixtures of exponential power densities
- Convergence of latent mixing measures in finite and infinite mixture models
- On the stick-breaking representation of normalized inverse Gaussian priors
- Posterior consistency in conditional density estimation by covariate dependent mixtures
- Sharp Optimality in Density Deconvolution with Dominating Bias. I
- Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models
- A family of densities derived from the three-parameter Dirichlet process
- Gibbs Sampling Methods for Stick-Breaking Priors
- Adaptive density estimation for clustering with Gaussian mixtures
- Adaptive Bayesian multivariate density estimation with Dirichlet mixtures
- Measure Theory and Probability Theory
- On the Estimation of the Probability Density, I
- Hierarchical Mixture Modeling With Normalized Inverse-Gaussian Priors
- Optimal Transport
- Asymptotically local minimax estimation of infinitely smooth density with censored data