Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (Q1040120)

From MaRDI portal

scientific article
Language: English
Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks

    Statements

    Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (English)
    23 November 2009
    Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, the error function of the BSCBP algorithm is proved to decrease monotonically during training, and its gradient is shown to tend to zero. Under an additional mild condition, the weight sequence itself is also proved to converge. A numerical example supports the theoretical analysis.
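    The setting described in the summary can be sketched in code. The snippet below is an illustrative implementation, not the paper's exact network or data: a single complex-valued neuron with a "split" activation (tanh applied independently to the real and imaginary parts of the net input), trained by batch gradient descent with a constant learning rate. The toy data, network size, and learning rate are all assumptions made for the illustration; the monotone decrease of the recorded batch error mirrors the monotonicity result stated in the summary.

    ```python
    import numpy as np

    # Illustrative split-complex backprop for one complex neuron (not the
    # paper's exact setup). The complex weight w = a + ib is updated through
    # the real gradients of the batch error with respect to a and b.
    rng = np.random.default_rng(0)

    # Toy batch: 20 complex samples with 3 inputs each (assumed data).
    Z = rng.normal(size=(20, 3)) + 1j * rng.normal(size=(20, 3))
    w_true = np.array([0.5 - 0.2j, -0.3 + 0.4j, 0.1 + 0.1j])
    s_true = Z @ w_true
    D = np.tanh(s_true.real) + 1j * np.tanh(s_true.imag)  # targets

    def batch_error(w):
        s = Z @ w                                    # complex net input
        o = np.tanh(s.real) + 1j * np.tanh(s.imag)   # split activation
        return 0.5 * np.sum(np.abs(o - D) ** 2)

    def split_gradient(w):
        s = Z @ w
        # Per-sample error signals for the real (u) and imaginary (v) parts.
        eu = (np.tanh(s.real) - D.real) * (1 - np.tanh(s.real) ** 2)
        ev = (np.tanh(s.imag) - D.imag) * (1 - np.tanh(s.imag) ** 2)
        # With w = a + ib and Z = X + iY:  u = X a - Y b,  v = Y a + X b.
        grad_a = Z.real.T @ eu + Z.imag.T @ ev
        grad_b = -Z.imag.T @ eu + Z.real.T @ ev
        return grad_a, grad_b

    eta = 0.01                       # constant learning rate (assumed value)
    w = np.zeros(3, dtype=complex)   # initial weights
    errors = [batch_error(w)]
    for _ in range(500):
        ga, gb = split_gradient(w)
        w = w - eta * (ga + 1j * gb)  # batch update of both real parts
        errors.append(batch_error(w))
    ```

    For a sufficiently small constant learning rate, `errors` is non-increasing over the iterations, which is the behavior the monotonicity theorem describes.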

    Identifiers