Efficient approximations of the Fisher matrix in neural networks using Kronecker product singular value decomposition
Publication: 6127062
DOI: 10.1051/proc/202373218
arXiv: 2201.10285
OpenAlex: W4281255645
MaRDI QID: Q6127062
Unnamed Author, Unnamed Author, Ibtihel Ben Gharbia, Mounir Haddou, Valérie Garès, Quang Huy Tran
Publication date: 10 April 2024
Published in: ESAIM: Proceedings and Surveys
Full work available at URL: https://arxiv.org/abs/2201.10285
Classification (MSC):
- Artificial neural networks and deep learning (68T07)
- Numerical computation of eigenvalues and eigenvectors of matrices (65F15)
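The title refers to approximating the Fisher information matrix of a neural network by Kronecker products obtained through a singular value decomposition. As a purely illustrative sketch not drawn from the publication itself, the snippet below shows the classical Van Loan-Pitsianis nearest-Kronecker-product construction (cf. "The ubiquitous Kronecker product" cited below), in which a rank-1 SVD of a rearranged matrix yields the two Kronecker factors; the function name, factor shapes, and NumPy usage are assumptions made for illustration only.

```python
import numpy as np

def nearest_kronecker_product(A, shape_B, shape_C):
    """Best Frobenius-norm approximation A ~ B kron C (Van Loan-Pitsianis).

    Hypothetical helper for illustration: A has shape (m1*m2, n1*n2),
    and the returned factors B, C have shapes shape_B=(m1, n1), shape_C=(m2, n2).
    """
    m1, n1 = shape_B
    m2, n2 = shape_C
    assert A.shape == (m1 * m2, n1 * n2)
    # Rearrange A so that each row is a flattened (m2 x n2) block of A;
    # for an exact Kronecker product this rearranged matrix has rank 1.
    R = A.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
    # The leading singular triplet of the rearranged matrix gives the factors.
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    B = np.sqrt(s[0]) * U[:, 0].reshape(m1, n1)
    C = np.sqrt(s[0]) * Vt[0, :].reshape(m2, n2)
    return B, C
```

Storing the two factors requires only O(m1*n1 + m2*n2) entries instead of O(m1*m2*n1*n2), which is the kind of saving Kronecker-factored curvature approximations exploit.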
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- On the limited memory BFGS method for large scale optimization
- The ubiquitous Kronecker product
- Numerical Methods for Large Eigenvalue Problems
- Reducing the Dimensionality of Data with Neural Networks
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- Riemannian metrics for neural networks I: feedforward networks
- Calculating the Singular Values and Pseudo-Inverse of a Matrix
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- A Stochastic Approximation Method