Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (Q1040120)

From MaRDI portal

Wikidata QID (P12): Q58647463

scientific article

Language: English
Label: Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks
Description: scientific article
    Statements

    Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (English)
    23 November 2009
    Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during training and that its gradient tends to zero. Under a mild additional condition, the weight sequence itself is also proved to converge. A numerical example supports the theoretical analysis.
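    The convergence behavior described above can be illustrated with a minimal sketch of batch split-complex gradient descent for a one-hidden-layer complex-valued network. The split activation applies a real tanh separately to the real and imaginary parts of each pre-activation, and real and imaginary parts of the weights are treated as independent real variables. The network size, learning rate, and all names (`sc_tanh`, `bscbp_train`) are illustrative assumptions, not the paper's notation.

    ```python
    import numpy as np

    def sc_tanh(z):
        """Split-complex activation: tanh applied to real and imaginary parts."""
        return np.tanh(z.real) + 1j * np.tanh(z.imag)

    def bscbp_train(X, T, n_hidden=4, lr=0.005, iters=100, seed=0):
        """Full-batch gradient descent sketch; returns the error per iteration.

        X: complex input matrix (samples x features), T: complex targets.
        """
        rng = np.random.default_rng(seed)
        shape_W = (X.shape[1], n_hidden)
        W = 0.1 * (rng.standard_normal(shape_W) + 1j * rng.standard_normal(shape_W))
        v = 0.1 * (rng.standard_normal((n_hidden, 1))
                   + 1j * rng.standard_normal((n_hidden, 1)))
        errors = []
        for _ in range(iters):
            U = X @ W                     # hidden pre-activations
            H = sc_tanh(U)                # split-complex hidden outputs
            E = H @ v - T                 # residual of the linear output layer
            errors.append(0.5 * np.sum(np.abs(E) ** 2))
            # The "split" view treats real/imaginary parts as independent real
            # variables; the complex arrays below just pack the two gradients.
            grad_v = H.conj().T @ E       # gradient w.r.t. output weights
            delta = E @ v.conj().T        # sensitivity w.r.t. hidden outputs
            gR = delta.real * (1.0 - H.real ** 2)   # d(error)/d(Re U)
            gI = delta.imag * (1.0 - H.imag ** 2)   # d(error)/d(Im U)
            grad_W = X.conj().T @ (gR + 1j * gI)    # gradient w.r.t. hidden weights
            v = v - lr * grad_v
            W = W - lr * grad_W
        return errors
    ```

    With a sufficiently small constant learning rate, the recorded errors decrease monotonically from one full-batch iteration to the next, which is the behavior the theorem formalizes.
    
    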