Density regression and uncertainty quantification with Bayesian deep noise neural networks
Publication: Q6548857
DOI: 10.1002/sta4.604 · MaRDI QID: Q6548857
Authors: Daiwei Zhang, Tianci Liu, Jian Kang
Publication date: 3 June 2024
Published in: Stat
Cites Work
- MCMC using Hamiltonian dynamics
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- Title not available
- Pattern recognition and machine learning
- Sampling-Based Approaches to Calculating Marginal Densities
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Deep learning
- Distribution-free Prediction Bands for Non-parametric Regression
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms
- Bayesian Density Regression
- Prediction Intervals for Artificial Neural Networks
- Gibbs Sampling
- Probable networks and plausible predictions — a review of practical Bayesian methods for supervised neural networks
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Error bounds for approximations with deep ReLU networks
- Title not available
- Deep distribution regression
- A Kernel-Expanded Stochastic Neural Network