Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (Q1040120)

From MaRDI portal
scientific article

    Statements

    23 November 2009
    Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during training and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to converge. A numerical example is given to support the theoretical analysis.
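    The setting of the summary can be illustrated with a minimal sketch (my own construction, not the paper's exact network or notation): a single complex-valued neuron with the split activation f(z) = tanh(Re z) + i tanh(Im z), trained by batch gradient descent with a small constant learning rate. The split-complex gradient treats the real and imaginary parts of each weight as independent real parameters. The network size, data, and learning rate below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    N, K = 20, 3                                  # batch size, input dimension (assumed)
    X = rng.normal(size=(N, K)) + 1j * rng.normal(size=(N, K))
    w_true = rng.normal(size=K) + 1j * rng.normal(size=K)
    s = X @ w_true
    D = np.tanh(s.real) + 1j * np.tanh(s.imag)    # targets from a teacher neuron

    def batch_error(w):
        s = X @ w
        y = np.tanh(s.real) + 1j * np.tanh(s.imag)   # split activation
        return 0.5 * np.sum(np.abs(y - D) ** 2)

    w = np.zeros(K, dtype=complex)
    eta = 0.002                                   # small constant learning rate (assumed)
    errors = [batch_error(w)]

    for _ in range(2000):
        s = X @ w
        a, b = np.tanh(s.real), np.tanh(s.imag)
        e = (a + 1j * b) - D
        # Split-complex gradient: differentiating the batch error with respect
        # to Re(w_k) and Im(w_k) separately and packing the result into one
        # complex vector gives grad = conj(X)^T g, where
        # g = Re(e)*(1 - a^2) + i*Im(e)*(1 - b^2).
        g = e.real * (1 - a ** 2) + 1j * e.imag * (1 - b ** 2)
        grad = X.conj().T @ g
        w = w - eta * grad                        # batch (full-gradient) update
        errors.append(batch_error(w))

    # With a sufficiently small constant learning rate the batch error decreases
    # monotonically, mirroring the monotonicity result described in the summary.
    monotone = all(e2 <= e1 + 1e-12 for e1, e2 in zip(errors, errors[1:]))
    ```

    Running this sketch, the recorded error sequence is non-increasing and ends well below its starting value, consistent with the qualitative behavior the convergence analysis predicts for a small constant learning rate.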

    Identifiers