Towards compact neural networks via end-to-end training: a Bayesian tensor approach with automatic rank determination
From MaRDI portal
Publication: 5037563
DOI: 10.1137/21M1391444
OpenAlex: W3118185732
MaRDI QID: Q5037563
FDO: Q5037563
Authors: Cole Hawkins, Xing Liu, Zheng Zhang
Publication date: 1 March 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2010.08689
Recommendations
- Hybrid tensor decomposition in neural network compression
- CMD: controllable matrix decomposition with global optimization for deep neural network compression
- Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives
- Multiresolution low-rank tensor formats
- Tensor networks for dimensionality reduction and large-scale optimization. I: Low-rank tensor decompositions
Cites Work
- PARAFAC: Parallel factor analysis
- The horseshoe estimator for sparse signals
- Analysis of individual differences in multidimensional scaling via an \(n\)-way generalization of "Eckart-Young" decomposition
- Title not available
- Tensor Decompositions and Applications
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Tensor-train decomposition
- Bayesian tensor regression
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- Mean field variational Bayes for elaborate distributions
- An introduction to variational methods for graphical models
- Robust low-rank tensor recovery: models and algorithms
- Title not available
- Model selection in Bayesian neural networks via horseshoe priors