Adaptive Bayesian nonparametric regression using a kernel mixture of polynomials with application to partial linear models
From MaRDI portal
Publication:2297237
Abstract: We propose a kernel mixture of polynomials prior for Bayesian nonparametric regression. The regression function is modeled by local averages of polynomials with kernel mixture weights. By estimating metric entropies of certain function classes, we obtain the minimax-optimal rate of contraction of the full posterior distribution, up to a logarithmic factor, adaptively over the smoothness level of the true function. We also provide a frequentist sieve maximum likelihood estimator with a near-optimal convergence rate. We further investigate the application of the kernel mixture of polynomials to the partial linear model and obtain both the near-optimal rate of contraction for the nonparametric component and the Bernstein-von Mises limit (i.e., asymptotic normality) of the parametric component. The proposed method is illustrated with numerical examples and shows superior performance in terms of computational efficiency, accuracy, and uncertainty quantification compared to local polynomial regression, DiceKriging, and the robust Gaussian stochastic process.
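As a rough illustration of the modeling idea in the abstract (not the authors' implementation): the regression function is a locally weighted average of polynomials, f(x) = Σ_k w_k(x) p_k(x), where the weights w_k are normalized kernel evaluations around centers μ_k. The sketch below assumes Gaussian kernels, equally spaced centers, and a fixed bandwidth; all names and choices here are illustrative.

```python
import numpy as np

def kernel_mixture_of_polynomials(x, centers, bandwidth, coefs):
    """Evaluate f(x) = sum_k w_k(x) * p_k(x).

    w_k(x): normalized Gaussian kernel weights centered at `centers`.
    p_k:    local polynomial in the rescaled coordinate (x - mu_k)/h,
            with coefficients coefs[k] (constant term first).
    """
    x = np.asarray(x, dtype=float)
    # Rescaled coordinates, shape (n_points, n_centers)
    u = (x[:, None] - centers[None, :]) / bandwidth
    # Unnormalized Gaussian kernel weights, then normalize so they sum to one
    w = np.exp(-0.5 * u**2)
    w /= w.sum(axis=1, keepdims=True)
    # Evaluate each local polynomial p_k at the rescaled coordinate
    degree = coefs.shape[1] - 1
    powers = u[:, :, None] ** np.arange(degree + 1)[None, None, :]
    p = (powers * coefs[None, :, :]).sum(axis=2)
    # Kernel-weighted average of the local polynomials
    return (w * p).sum(axis=1)

# Illustrative evaluation: 5 local quadratics on [0, 1]
centers = np.linspace(0.0, 1.0, 5)
coefs = np.zeros((5, 3))
coefs[:, 0] = np.sin(2 * np.pi * centers)  # constant terms only, for the demo
fx = kernel_mixture_of_polynomials(np.linspace(0, 1, 11), centers, 0.2, coefs)
```

In the Bayesian formulation described in the abstract, priors would be placed on the polynomial coefficients (and possibly the bandwidth); the sketch only shows how a single draw of the regression function is evaluated.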
Recommendations
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- Adaptive-modal Bayesian nonparametric regression
- Bayesian estimation for nonparametric regression
- Locally adaptive Bayes nonparametric regression via nested Gaussian processes
- Supremum norm posterior contraction and credible sets for nonparametric multivariate regression
Cites work
- scientific article; zbMATH DE number 4098524
- scientific article; zbMATH DE number 47282
- scientific article; zbMATH DE number 1181283
- scientific article; zbMATH DE number 893887
- scientific article; zbMATH DE number 5060482
- scientific article; zbMATH DE number 3222478
- scientific article; zbMATH DE number 3063387
- A Partially Linear Model Using a Gaussian Process Prior
- A distribution-free theory of nonparametric regression
- Adaptive Bayesian density estimation with location-scale mixtures
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- Adaptive Bayesian multivariate density estimation with Dirichlet mixtures
- Adaptive Bayesian nonparametric regression using a kernel mixture of polynomials with application to partial linear models
- Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors
- Adaptive nonparametric Bayesian inference using location-scale mixture priors
- An Efficient Semiparametric Estimator for Binary Response Models
- Anisotropic function estimation using multi-bandwidth Gaussian processes
- Bayesian Inference for Semiparametric Regression Using a Fourier Representation
- Bayesian data analysis.
- Bayesian estimation of sparse signals with a continuous spike-and-slab prior
- Bayesian inference for latent biologic structure with determinantal point processes (DPP)
- Bayesian inference with rescaled Gaussian process priors
- Bayesian inverse problems with Gaussian priors
- Computational and Inferential Difficulties with Mixture Posterior Distributions
- Convergence rate of sieve estimates
- Convergence rates for parametric components in a partly linear model
- Convergence rates of posterior distributions for non iid observations
- Convergence rates of posterior distributions.
- Dirichlet-Laplace priors for optimal shrinkage
- Efficient calibration for imperfect computer models
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities.
- Frequentist coverage of adaptive nonparametric Bayesian credible sets
- Fundamentals of nonparametric Bayesian inference
- Gaussian processes for machine learning.
- Information rates of nonparametric Gaussian process methods
- Markov chain Monte Carlo methods and the label switching problem in Bayesian mixture modeling
- Needles and straw in a haystack: posterior concentration for possibly sparse sequences
- Optimal global rates of convergence for nonparametric regression
- Orthogonal Gaussian process models
- Posterior concentration for Bayesian regression trees and forests
- Posterior contraction in sparse Bayesian factor models for massive covariance matrices
- Posterior convergence rates of Dirichlet mixtures at smooth densities
- Rate-optimal posterior contraction for sparse PCA
- Rates of contraction of posterior distributions based on Gaussian process priors
- Robust Gaussian stochastic process emulation
- Root-N-Consistent Semiparametric Regression
- Root-n-consistent estimation of partially linear time series models
- Semiparametric least squares (SLS) and weighted SLS estimation of single-index models
- Supremum norm posterior contraction and credible sets for nonparametric multivariate regression
- The semiparametric Bernstein-von Mises theorem
Cited in (3)