High dimensional neurocomputing. Growth, appraisal and applications (Q2263180)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | High dimensional neurocomputing. Growth, appraisal and applications | scientific article |
Statements
High dimensional neurocomputing. Growth, appraisal and applications (English)
0 references
17 March 2015
0 references
The first chapter briefly discusses high-dimensional neural networks, defined as networks that process complex-, quaternion-, or octonion-valued information. In complex-valued neural networks the inputs, the weights and the outputs are represented by complex numbers; such networks can be used, for example, to represent two-dimensional signals. Empirical experiments show that complex-valued neural networks converge faster than their real-valued counterparts. The extension of real-valued activation functions into the complex domain is introduced, followed by the definition of the error function in the complex domain, and a modified backpropagation algorithm is presented. Higher-order neurons are then introduced; their output is a vector of odd dimension, which complex numbers, quaternions and octonions cannot represent directly, since their dimensions are even. Finally, complex-valued PCA and ICA, based on the Hilbert transform, are introduced, and empirical experiments are described with examples of face recognition. The book is composed of seven chapters, each structured like a journal article, and no index is provided. The book offers a fast and easy introduction to the domain of complex-valued neural networks.
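To illustrate the idea of a complex-valued neural network described above, here is a minimal sketch (not taken from the book) of a single complex-valued neuron. It assumes a "split" activation, one common way to extend a real-valued activation function to the complex domain: the real function (here `tanh`) is applied separately to the real and imaginary parts. All function names are illustrative.

```python
import numpy as np

def split_tanh(z):
    """Extend tanh to the complex domain by applying it
    separately to the real and imaginary parts (assumed 'split' scheme)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def complex_neuron(x, w, b):
    """Forward pass of one complex-valued neuron:
    complex-weighted sum of complex inputs, plus a complex bias,
    passed through the split activation."""
    return split_tanh(np.dot(w, x) + b)

# Example: a two-dimensional signal encoded as one complex
# number per sample point (real part = x-coordinate, imaginary
# part = y-coordinate), as suggested by the review.
x = np.array([0.5 + 0.2j, -0.3 + 0.8j])  # complex inputs
w = np.array([0.7 - 0.1j, 0.2 + 0.4j])   # complex weights
b = 0.1 + 0.0j                           # complex bias
y = complex_neuron(x, w, b)              # complex output
```

The split activation avoids the difficulty that any bounded, everywhere-analytic function on the complex plane must be constant (Liouville's theorem), which is why real activations are not extended analytically.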
0 references
neural networks
0 references
PCA
0 references
ICA
0 references
complex numbers
0 references
quaternions
0 references
face recognition
0 references