Density regression and uncertainty quantification with Bayesian deep noise neural networks
From MaRDI portal
Publication:6548857
Cites Work
- scientific article; zbMATH DE number 6378127
- scientific article; zbMATH DE number 6860839
- A Kernel-Expanded Stochastic Neural Network
- Bayesian Density Regression
- Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors
- Deep distribution regression
- Deep learning
- Distribution-free Prediction Bands for Non-parametric Regression
- Error bounds for approximations with deep ReLU networks
- Geometric ergodicity of random scan Gibbs samplers for hierarchical one-way random effects models
- Gibbs Sampling
- MCMC using Hamiltonian dynamics
- Pattern recognition and machine learning
- Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data
- Prediction Intervals for Artificial Neural Networks
- Probable networks and plausible predictions — a review of practical Bayesian methods for supervised neural networks
- Sampling-Based Approaches to Calculating Marginal Densities
- Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms
- Stability of the Gibbs sampler for Bayesian hierarchical models
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo