Adaptive Bayesian density estimation with location-scale mixtures
DOI: 10.1214/10-EJS584
zbMath: 1329.62188
MaRDI QID: Q1952098
Judith Rousseau, Willem Kruijer, Aad W. van der Vaart
Publication date: 27 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1289226500
Keywords: convergence rates; nonparametric density estimation; Bayesian density estimation; location-scale mixtures; rate-adaptive density estimation
62G07: Density estimation
62G20: Asymptotic properties of nonparametric inference
62F15: Bayesian inference
Related Items
- Mixture Models With a Prior on the Number of Components
- Empirical Bayes Conditional Density Estimation
- Probabilistic Community Detection With Unknown Number of Communities
- Construction of credible intervals for nonlinear regression models with unknown error distributions
- ADAPTIVE BAYESIAN ESTIMATION OF CONDITIONAL DENSITIES
- Bayesian sieve methods: approximation rates and adaptive posterior contraction rates
- Rate exact Bayesian adaptation with modified block priors
- Posterior contraction rates for deconvolution of Dirichlet-Laplace mixtures
- Posterior consistency in conditional distribution estimation
- Concentration rate and consistency of the posterior distribution for selected priors under monotonicity constraints
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- On adaptive posterior concentration rates
- Adaptive Bayesian density estimation in \(L^p\)-metrics with Pitman-Yor or normalized inverse-Gaussian process kernel mixtures
- Bayesian adaptation
- Bayesian fractional posteriors
- Consistency of variational Bayes inference for estimation and model selection in mixtures
- Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors
- Oracle posterior contraction rates under hierarchical priors
- Consistent online Gaussian process regression without the sample complexity bottleneck
- Nonasymptotic control of the MLE for misspecified nonparametric hidden Markov models
- Parameter recovery in two-component contamination mixtures: the \(L^2\) strategy
- Convergence rates of variational posterior distributions
- Adaptive Bayesian nonparametric regression using a kernel mixture of polynomials with application to partial linear models
- Data-driven priors and their posterior concentration rates
- Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression
- Consistency of the maximum likelihood estimator in seasonal hidden Markov models
- Bayesian regression with nonparametric heteroskedasticity
- On posterior consistency of tail index for Bayesian kernel mixture models
- Anisotropic function estimation using multi-bandwidth Gaussian processes
- Bayesian Optimal Adaptive Estimation Using a Sieve Prior
- POSTERIOR CONSISTENCY IN CONDITIONAL DENSITY ESTIMATION BY COVARIATE DEPENDENT MIXTURES
- Bayesian Repulsive Gaussian Mixture Model
- On some aspects of the asymptotic properties of Bayesian approaches in nonparametric and semiparametric models
Cites Work
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- Minimax theory of image reconstruction
- Rates of convergence for the posterior distributions of mixtures of betas and adaptive nonparametric estimation of the density
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- Posterior convergence rates of Dirichlet mixtures at smooth densities
- Nonparametric maximum likelihood estimation by the method of sieves
- Some aspects of Pólya tree distributions for statistical modelling
- A note on the usefulness of superkernels in density estimation
- On Bayesian adaptation
- Posterior consistency of Dirichlet mixtures in density estimation
- Convergence rates of posterior distributions
- Rates of convergence for the Gaussian mixture sieve
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities
- Convergence rates for density estimation with Bernstein polynomials
- Convergence rates for posterior distributions and adaptive estimation
- Dynamics of Bayesian updating with dependent data and misspecified models
- Posterior rates of convergence for Dirichlet mixtures of exponential power densities
- Kullback Leibler property of kernel mixture priors in Bayesian density estimation
- Posterior convergence rates for Dirichlet mixtures of beta densities
- Information-theoretic upper and lower bounds for statistical estimation
- Bayesian Nonparametrics
- Bayesian Density Estimation and Inference Using Mixtures
- A non asymptotic penalized criterion for Gaussian mixture model selection
- Twicing Kernels and a Small Bias Property of Semiparametric Estimators