Towards compact neural networks via end-to-end training: a Bayesian tensor approach with automatic rank determination
From MaRDI portal
Publication:5037563
Recommendations
- Hybrid tensor decomposition in neural network compression
- CMD: controllable matrix decomposition with global optimization for deep neural network compression
- Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
- Multiresolution low-rank tensor formats
- Tensor networks for dimensionality reduction and large-scale optimization. I: Low-rank tensor decompositions
Cites Work
- scientific article; zbMATH DE number 6377992
- scientific article; zbMATH DE number 6982943
- An introduction to variational methods for graphical models
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of "Eckart-Young" decomposition
- Bayesian tensor regression
- Mean field variational Bayes for elaborate distributions
- Model selection in Bayesian neural networks via horseshoe priors
- PARAFAC: Parallel factor analysis
- Robust low-rank tensor recovery: models and algorithms
- Tensor Decompositions and Applications
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Tensor-train decomposition
- The horseshoe estimator for sparse signals
Cited In (2)