Convergence analysis of an augmented algorithm for fully complex-valued neural networks
DOI: 10.1016/j.neunet.2015.05.003 · zbMATH Open: 1401.68271 · OpenAlex: W776779855 · Wikidata: Q50901832 · Scholia: Q50901832 · MaRDI QID: Q1669147 · FDO: Q1669147
Authors: Huisheng Zhang, Danilo P. Mandic, Dong-po Xu
Publication date: 30 August 2018
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2015.05.003
Recommendations
- Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks
- Convergence analysis of three classes of Split-complex gradient algorithms for complex-valued recurrent neural networks
- Convergence of an online split-complex gradient algorithm for complex-valued neural networks
- Fully complex conjugate gradient-based neural networks using Wirtinger calculus framework: deterministic convergence and its application
- A note on the complex and bicomplex valued neural networks
Keywords: convergence; complex-valued neural networks; Wirtinger calculus; augmented algorithm; unified mean value theorem
MSC classifications: Learning and adaptive systems in artificial intelligence (68T05); PDEs in connection with computer science (35Q68)
Cites Work
- Title not available
- Complex Valued Nonlinear Adaptive Filters
- Second-order analysis of improper complex random vectors and processes
- Complex ICA Using Nonlinear Functions
- Approximation by Fully Complex Multilayer Perceptrons
- Convergence analysis of three classes of Split-complex gradient algorithms for complex-valued recurrent neural networks
- Batch gradient method with smoothing \(L_{1/2}\) regularization for training of feedforward neural networks
- Complex-valued neural networks
- Complex-valued neural networks with multi-valued neurons.
- Convergence analysis of online gradient method for BP neural networks
- Complex domain backpropagation
- An Augmented Extended Kalman Filter Algorithm for Complex-Valued Recurrent Neural Networks
- A Complex-Valued RTRL Algorithm for Recurrent Neural Networks
- Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks
- An augmented CRTRL for complex-valued recurrent neural networks
Cited In (12)
- A note on the complex and bicomplex valued neural networks
- Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks
- Parameter-range-dependent robust stability conditions for quaternion-valued neural networks with time delays
- Arched foot based on conformal complex neural network testing
- Neural iterative learning identifier-based iterative learning controller for time-varying nonlinear systems
- Widely linear complex-valued least mean M-estimate algorithms: design and performance analysis
- Convergence analysis of AdaBound with relaxed bound functions for non-convex optimization
- Convergence of the RMSProp deep learning method with penalty for nonconvex optimization
- Smoothed \(L_{1/2}\) regularizer learning for split-complex valued neuro-fuzzy algorithm for TSK system and its convergence results
- Exponential synchronization for second-order nodes in complex dynamical network with communication time delays and switching topologies
- Fully complex conjugate gradient-based neural networks using Wirtinger calculus framework: deterministic convergence and its application
- Convergence analysis of three classes of Split-complex gradient algorithms for complex-valued recurrent neural networks