Training a multilayer network with low-memory kernel-and-range projection
From MaRDI portal
Publication:2291074
DOI: 10.1016/j.jfranklin.2019.11.074 · zbMath: 1470.68213 · OpenAlex: W2992346182 · MaRDI QID: Q2291074
Huiping Zhuang, Kar-Ann Toh, Zhiping Lin
Publication date: 30 January 2020
Published in: Journal of the Franklin Institute
Full work available at URL: https://doi.org/10.1016/j.jfranklin.2019.11.074
Cites Work
- Nonlinear single layer neural network training algorithm for incremental, nonstationary and distributed learning scenarios
- About the generalized \(LM\)-inverse and the weighted Moore-Penrose inverse
- General forms for the recursive determination of generalized inverses: Unified approach
- An alternative proof of the Greville formula
- Sequential determination of the \(\{1, 4\}\)-inverse of a matrix
- Subset selection for visualization of relevant image fractions for deep learning based semantic image segmentation
- Generalized inverses. Theory and applications.
- Generalized inverse of linear transformations: A geometric approach
- Recursive determination of the generalized Moore-Penrose \(M\)-inverse of a matrix
- Variants of the Greville Formula with Applications to Exact Recursive Least Squares
- Generalized Inverses of Linear Transformations
- Learning representations by back-propagating errors
- Some Applications of the Pseudoinverse of a Matrix
- Discrete-time signal processing. An algebraic approach
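Several of the cited works center on Greville's recursive formula for updating the Moore–Penrose pseudoinverse one column at a time, which underlies exact recursive least-squares training. As a minimal illustrative sketch (this is the classical Greville recursion, not the paper's kernel-and-range projection algorithm), the pseudoinverse of a matrix can be built incrementally as columns arrive:

```python
import numpy as np

def greville_pinv(A, tol=1e-10):
    """Compute the Moore-Penrose pseudoinverse of A column-recursively
    via Greville's formula, as analyzed in the cited recursive-inverse works."""
    m, n = A.shape
    # Pseudoinverse of the first column: a^+ = a^T / (a^T a), or 0^T if a = 0.
    a1 = A[:, [0]]
    s = float(a1.T @ a1)
    Ap = a1.T / s if s > tol else np.zeros((1, m))
    for k in range(1, n):
        a = A[:, [k]]
        d = Ap @ a                      # coordinates of a in the current range
        c = a - A[:, :k] @ d            # component of a outside the range
        if np.linalg.norm(c) > tol:
            b = c.T / float(c.T @ c)    # new column is linearly independent
        else:
            b = (d.T @ Ap) / (1.0 + float(d.T @ d))  # dependent-column branch
        Ap = np.vstack([Ap - d @ b, b]) # append the new row of the pseudoinverse
    return Ap

# Usage: matches the direct SVD-based pseudoinverse on a random tall matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
assert np.allclose(greville_pinv(A), np.linalg.pinv(A))
```

The recursion avoids refactorizing from scratch at each step, which is what makes it attractive for incremental and low-memory learning settings like the one this publication addresses.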