Quantile regression neural networks: a Bayesian approach
From MaRDI portal
Abstract: This article introduces a Bayesian neural network estimation method for quantile regression that assumes an asymmetric Laplace distribution (ALD) for the response variable. It is shown that the posterior distribution for feedforward neural network quantile regression is asymptotically consistent under a misspecified ALD model. The consistency proof embeds the problem in the density estimation domain and uses bounds on the bracketing entropy to derive posterior consistency over Hellinger neighborhoods. The result holds in the setting where the number of hidden nodes grows with the sample size. The Bayesian implementation utilizes the normal-exponential mixture representation of the ALD density, and the algorithm uses Markov chain Monte Carlo (MCMC) simulation: Gibbs sampling coupled with a Metropolis-Hastings step. We address the complexity of the aforementioned MCMC implementation in the context of chain convergence, choice of starting values, and step sizes. We illustrate the proposed method with simulation studies and real data examples.
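The normal-exponential mixture representation mentioned in the abstract is what makes Gibbs sampling tractable: an ALD(mu, sigma, p) variate can be written as a normal variable whose mean and variance depend on an exponential latent variable, and under this representation the p-th quantile of the response is exactly mu. A minimal sketch of that representation (the function name and parameter choices below are illustrative, not taken from the article's implementation):

```python
import numpy as np

def sample_ald(mu, sigma, p, size, rng):
    """Draw from ALD(mu, sigma, p) via the normal-exponential mixture:
        y = mu + sigma*theta*z + sigma*tau*sqrt(z)*u,
    where z ~ Exp(1), u ~ N(0, 1),
        theta = (1 - 2p) / (p * (1 - p)),  tau^2 = 2 / (p * (1 - p)).
    Under this representation, the p-th quantile of y equals mu,
    which is what ties the ALD likelihood to quantile regression."""
    theta = (1 - 2 * p) / (p * (1 - p))
    tau = np.sqrt(2 / (p * (1 - p)))
    z = rng.exponential(scale=1.0, size=size)   # latent exponential mixing variable
    u = rng.standard_normal(size)               # standard normal shock
    return mu + sigma * theta * z + sigma * tau * np.sqrt(z) * u

# Sanity check: the empirical p-quantile of the draws should sit near mu.
rng = np.random.default_rng(0)
draws = sample_ald(mu=1.5, sigma=0.7, p=0.25, size=200_000, rng=rng)
print(np.quantile(draws, 0.25))  # should be close to mu = 1.5
```

Conditioning on the latent z turns the ALD likelihood into a (heteroscedastic) normal likelihood, which is why the article's sampler can alternate Gibbs updates for the latent variables with Metropolis-Hastings updates for the network weights.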
Recommendations
- Bayesian quantile regression
- Fully Bayesian estimation of simultaneous regression quantiles under asymmetric Laplace distribution specification
- Gibbs sampling methods for Bayesian quantile regression
- Bayesian quantile regression with approximate likelihood
- A Bayesian nonparametric approach to inference for quantile regression
Cites work
- scientific article; zbMATH DE number 1666100 (title unavailable)
- scientific article; zbMATH DE number 3983087 (title unavailable)
- scientific article; zbMATH DE number 3177183 (title unavailable)
- scientific article; zbMATH DE number 88839 (title unavailable)
- scientific article; zbMATH DE number 149062 (title unavailable)
- scientific article; zbMATH DE number 1219018 (title unavailable)
- scientific article; zbMATH DE number 3442988 (title unavailable)
- A Bayesian Semiparametric Accelerated Failure Time Model
- A Finite Smoothing Algorithm for Linear $l_1 $ Estimation
- A Three-Parameter Asymmetric Laplace Distribution and Its Extension
- A new polynomial-time algorithm for linear programming
- Approximation by superpositions of a sigmoidal function
- Bayesian Semiparametric Median Regression Modeling
- Bayesian learning for neural networks
- Bayesian methods for neural networks and related models
- Bayesian quantile regression
- Brq: an R package for Bayesian quantile regression
- Gibbs sampling methods for Bayesian quantile regression
- Goodness of Fit and Related Inference Processes for Quantile Regression
- Inference from iterative simulation using multiple sequences
- Multilayer feedforward networks are universal approximators
- Non-Gaussian Ornstein-Uhlenbeck-based models and some of their uses in financial economics. (With discussion)
- Noninformative priors for one-parameter item response models
- Posterior consistency of Bayesian quantile regression based on the misspecified asymmetric Laplace density
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- Quantile regression.
- Regression Quantiles
- Statistical Analysis of Financial Data in S-Plus
- The consistency of posterior distributions in nonparametric problems
- Weak convergence and empirical processes. With applications to statistics
Cited in (3 documents)
MaRDI item: Q2241709