Global exponential convergence and stability of gradient-based neural network for online matrix inversion
DOI: 10.1016/j.amc.2009.06.048 | zbMath: 1194.65056 | OpenAlex: W2154410492 | MaRDI QID: Q734897
Ke Chen, Chaoli Wang, Yanyan Shi, Yu-Nong Zhang
Publication date: 14 October 2009
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2009.06.048
Keywords: stability; Lyapunov function; neural network; numerical examples; exponential convergence; matrix inverse; Moore-Penrose pseudoinverse; online matrix inversion
MSC classifications: Iterative numerical methods for linear systems (65F10); Linear ordinary differential equations and systems (34A30); Numerical methods for initial value problems involving ordinary differential equations (65L05)
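As a minimal sketch of the kind of model the title refers to: the classical gradient-based neural network for inverting a nonsingular matrix A minimizes the energy E(X) = ||AX - I||_F^2 / 2, yielding the dynamics dX/dt = -γ Aᵀ(AX - I), whose state X(t) converges exponentially to A⁻¹. The function name and the particular parameter values (γ, step size, step count) below are illustrative choices, not taken from the paper.

```python
import numpy as np

def gnn_inverse(A, gamma=10.0, dt=1e-3, steps=20000):
    """Forward-Euler integration of the gradient-based neural network
    dX/dt = -gamma * A.T @ (A @ X - I), whose equilibrium is inv(A).
    (Illustrative sketch; gamma/dt/steps are arbitrary demo values.)"""
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))  # zero initial state
    for _ in range(steps):
        # gradient of ||A X - I||_F^2 / 2 with respect to X is A.T (A X - I)
        X = X - dt * gamma * (A.T @ (A @ X - I))
    return X

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = gnn_inverse(A)
print(np.allclose(X, np.linalg.inv(A), atol=1e-4))  # prints True
```

With γ λ_min(AᵀA) governing the slowest mode, the discrete error contracts geometrically here, mirroring the global exponential convergence analyzed in the paper for the continuous-time model.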
Related Items (17)
Uses Software
Cites Work
- A recurrent neural network for real-time matrix inversion
- Different stochastic algorithms to obtain matrix inversion
- Exploiting Hessian matrix and trust-region algorithm in hyperparameters estimation of Gaussian process
- O(N2)-Operation Approximation of Covariance Matrix Inverse in Gaussian Process Regression Based on Quasi-Newton BFGS Method
- Symbolic matrix inversion with application to electronic circuits
- Regularized image reconstruction using SVD and a neural network method for matrix inversion
- Survey of numerical methods for solution of large systems of linear equations for electromagnetic field problems
- A systolic architecture for fast dense matrix inversion