Bayesian distillation of deep learning models
DOI: 10.1134/S0005117921110023
zbMATH Open: 1491.68180
OpenAlex: W4200392901
MaRDI QID: Q2069701
FDO: Q2069701
Authors: A. V. Grabovoy, Vadim Strijov
Publication date: 21 January 2022
Published in: Automation and Remote Control
Full work available at URL: https://doi.org/10.1134/s0005117921110023
Recommendations
- Deep learning: a Bayesian perspective
- Bayesian learning for neural networks
- Bayesian Deep Net GLM and GLMM
- Learning Summary Statistic for Approximate Bayesian Computation via Deep Neural Network
- Priors in Bayesian Deep Learning: A Review
- Bayesian learning for recurrent neural networks
- Deep variational inference
- Bayesian deep convolutional encoder-decoder networks for surrogate modeling and uncertainty quantification
- Bayesian inference in neural networks
Cited In (5)
- Marginally Calibrated Deep Distributional Regression
- Probabilistic interpretation of the distillation problem
- Measure transformer semantics for Bayesian machine learning
- Gradient methods for optimizing metaparameters in the knowledge distillation problem
- Learning Summary Statistic for Approximate Bayesian Computation via Deep Neural Network