Adaptive Bayesian density estimation with location-scale mixtures
From MaRDI portal
Publication:1952098
DOI: 10.1214/10-EJS584
zbMath: 1329.62188
MaRDI QID: Q1952098
Judith Rousseau, Willem Kruijer, Aad W. van der Vaart
Publication date: 27 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1289226500
Keywords: convergence rates; nonparametric density estimation; Bayesian density estimation; location-scale mixtures; rate-adaptive density estimation
MSC classification: Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Bayesian inference (62F15)
Related Items
- Empirical Bayes Conditional Density Estimation
- Bayesian high-dimensional semi-parametric inference beyond sub-Gaussian errors
- On some aspects of the asymptotic properties of Bayesian approaches in nonparametric and semiparametric models
- Adaptive Bayesian estimation of conditional discrete-continuous distributions with an application to stock market trading activity
- Mixture Models With a Prior on the Number of Components
- POSTERIOR CONSISTENCY IN CONDITIONAL DENSITY ESTIMATION BY COVARIATE DEPENDENT MIXTURES
- Parameter recovery in two-component contamination mixtures: the \(L^2\) strategy
- Posterior consistency in conditional distribution estimation
- On adaptive posterior concentration rates
- Uniform consistency in nonparametric mixture models
- Smoothing and adaptation of shifted Pólya tree ensembles
- On posterior consistency of tail index for Bayesian kernel mixture models
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- Concentration rate and consistency of the posterior distribution for selected priors under monotonicity constraints
- ADAPTIVE BAYESIAN ESTIMATION OF CONDITIONAL DENSITIES
- Adaptive Bayesian density estimation in \(L^p\)-metrics with Pitman-Yor or normalized inverse-Gaussian process kernel mixtures
- Bayesian adaptation
- Consistency of mixture models with a prior on the number of components
- Intuitive joint priors for Bayesian linear multilevel models: the R2D2M2 prior
- Optimal Bayesian estimation of Gaussian mixtures with growing number of components
- Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors
- Bayesian sieve methods: approximation rates and adaptive posterior contraction rates
- Convergence rates of variational posterior distributions
- Anisotropic function estimation using multi-bandwidth Gaussian processes
- Bayesian fractional posteriors
- Bayesian Repulsive Gaussian Mixture Model
- Consistency of variational Bayes inference for estimation and model selection in mixtures
- Rate exact Bayesian adaptation with modified block priors
- Oracle posterior contraction rates under hierarchical priors
- Adaptive Bayesian nonparametric regression using a kernel mixture of polynomials with application to partial linear models
- Consistent online Gaussian process regression without the sample complexity bottleneck
- Probabilistic Community Detection With Unknown Number of Communities
- Nonasymptotic control of the MLE for misspecified nonparametric hidden Markov models
- Construction of credible intervals for nonlinear regression models with unknown error distributions
- Data-driven priors and their posterior concentration rates
- Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression
- Consistency of the maximum likelihood estimator in seasonal hidden Markov models
- Bayesian Optimal Adaptive Estimation Using a Sieve Prior
- Bayesian regression with nonparametric heteroskedasticity
- Posterior contraction rates for deconvolution of Dirichlet-Laplace mixtures
Cites Work
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- Minimax theory of image reconstruction
- Rates of convergence for the posterior distributions of mixtures of betas and adaptive nonparametric estimation of the density
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- Posterior convergence rates of Dirichlet mixtures at smooth densities
- Nonparametric maximum likelihood estimation by the method of sieves
- Some aspects of Pólya tree distributions for statistical modelling
- A note on the usefulness of superkernels in density estimation
- On Bayesian adaptation
- Posterior consistency of Dirichlet mixtures in density estimation
- Convergence rates of posterior distributions
- Rates of convergence for the Gaussian mixture sieve
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities
- Convergence rates for density estimation with Bernstein polynomials
- Convergence rates for posterior distributions and adaptive estimation
- Dynamics of Bayesian updating with dependent data and misspecified models
- Posterior rates of convergence for Dirichlet mixtures of exponential power densities
- Kullback Leibler property of kernel mixture priors in Bayesian density estimation
- Posterior convergence rates for Dirichlet mixtures of beta densities
- Information-theoretic upper and lower bounds for statistical estimation
- Bayesian Nonparametrics
- Bayesian Density Estimation and Inference Using Mixtures
- A non asymptotic penalized criterion for Gaussian mixture model selection
- Twicing Kernels and a Small Bias Property of Semiparametric Estimators