Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (Q1040120)

From MaRDI portal
Property / OpenAlex ID: W2018027033

scientific article
Language: English
Label: Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks
Description: scientific article

    Statements

    Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks (English)
    23 November 2009
    Summary: The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to converge. A numerical example is given to support the theoretical analysis.
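The split-complex update described in the summary can be sketched as a toy batch gradient step on a single complex-valued neuron. Everything below (network size, data, seed, and learning rate) is a hypothetical illustration of the split-complex scheme, not the paper's actual network or numerical example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch of complex inputs (J samples, n features) and complex targets.
# All sizes and the learning rate are hypothetical, for illustration only.
J, n = 8, 3
X = rng.standard_normal((J, n)) + 1j * rng.standard_normal((J, n))
T = rng.standard_normal(J) + 1j * rng.standard_normal(J)

# Split-complex weights: real and imaginary parts are updated separately.
wR = 0.1 * rng.standard_normal(n)
wI = 0.1 * rng.standard_normal(n)
eta = 0.005  # constant learning rate, as assumed in the convergence analysis

errors = []
for _ in range(400):
    u = X @ (wR + 1j * wI)                      # complex net input
    o = np.tanh(u.real) + 1j * np.tanh(u.imag)  # split-complex activation
    e = o - T
    errors.append(0.5 * np.sum(np.abs(e) ** 2))  # batch error function

    # Error derivatives w.r.t. the real/imaginary parts of the net input
    dR = e.real * (1.0 - np.tanh(u.real) ** 2)
    dI = e.imag * (1.0 - np.tanh(u.imag) ** 2)

    # Batch gradients for the two weight components, using
    # u.real = X.real @ wR - X.imag @ wI and u.imag = X.imag @ wR + X.real @ wI
    gR = dR @ X.real + dI @ X.imag
    gI = -dR @ X.imag + dI @ X.real

    wR -= eta * gR
    wI -= eta * gI
```

With a sufficiently small constant learning rate, the recorded error values decrease monotonically, matching the first conclusion of the analysis.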

    Identifiers