Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (Q1040120)
From MaRDI portal
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks | scientific article | |
Statements
Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (English)
23 November 2009
Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm is monotone during the training iteration process and that the gradient of the error function tends to zero. Under a moderate additional condition, the weight sequence itself is also proved to converge. A numerical example is given to support the theoretical analysis.
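The summary describes the split-complex approach only in prose, so a minimal sketch may help fix ideas. The Python code below implements one batch update for a single complex-valued neuron with a split sigmoid activation and a constant learning rate; it is an illustrative assumption, not the network or notation used in the paper, and the names `bscbp_epoch`, `sigma`, and `eta` are hypothetical.

```python
# Minimal sketch (assumed setup, not the paper's exact model) of batch
# split-complex backpropagation for a single complex-valued neuron.
import numpy as np

def sigma(t):
    """Real sigmoid, applied separately to real and imaginary parts."""
    return 1.0 / (1.0 + np.exp(-t))

def bscbp_epoch(w, X, d, eta=0.1):
    """One batch update with a constant learning rate eta.

    w : (n,) complex weights;  X : (m, n) complex inputs;  d : (m,) complex targets.
    The gradient is accumulated over the whole batch before the weights change.
    """
    a, b = w.real.copy(), w.imag.copy()          # split weights into real/imaginary parts
    p, q = X.real, X.imag                        # split inputs
    u_re = p @ a - q @ b                         # Re(w . x) for each sample
    u_im = q @ a + p @ b                         # Im(w . x) for each sample
    o_re, o_im = sigma(u_re), sigma(u_im)        # split activation
    e_re, e_im = o_re - d.real, o_im - d.imag    # output errors per channel
    s_re = e_re * o_re * (1 - o_re)              # delta for the real channel
    s_im = e_im * o_im * (1 - o_im)              # delta for the imaginary channel
    grad_a = s_re @ p + s_im @ q                 # dE/da summed over the batch
    grad_b = -s_re @ q + s_im @ p                # dE/db summed over the batch
    a -= eta * grad_a                            # constant-learning-rate update
    b -= eta * grad_b
    error = 0.5 * np.sum(e_re**2 + e_im**2)      # squared error before the update
    return a + 1j * b, error
```

Iterating `bscbp_epoch` on a fixed batch with a sufficiently small `eta` should display the monotone behaviour of the error and the vanishing gradient described in the summary.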