On the performance of single-layered neural networks (Q1202336)
Language | Label | Description | Also known as
---|---|---|---
English | On the performance of single-layered neural networks | scientific article |
Statements
On the performance of single-layered neural networks (English)
23 February 1993
The performance of single-layer neural networks is addressed. First, the concept of optimal least-squares is introduced and the optimal training algorithm for a least-squares optimality criterion is described. Next, the well-known outer-product training rule is analyzed. It is shown that the outer-product rule is in fact a sub-optimal least-squares training algorithm, with optimality achieved whenever the training data are orthogonal. This is demonstrated through the development of a useful infinite series expansion of the network equations. Next, the capacity of the single-layer neural network is examined. It is shown that the use of the optimal least-squares training algorithm improves the capacity of the network over that of the outer-product rule by a factor of \(\log n\), where \(n\) is the dimension of the input space. Also, it is demonstrated that training networks with respect to their binary output increases the amount of information that can be reliably stored in the network during training.
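The relationship between the two training rules can be illustrated with a minimal linear-algebra sketch, which is not taken from the paper under review: for a linear single-layer network with input patterns stored as the columns of a matrix \(X\) and targets \(Y\), the outer-product rule gives weights \(W = YX^T\), while the least-squares optimum is \(W = YX^+\), with \(X^+\) the Moore-Penrose pseudoinverse; the two coincide when the input patterns are orthonormal. The NumPy code below and the names `W_op`, `W_ls` are illustrative assumptions, not the authors' notation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 16, 8, 4                 # input dimension, number of patterns, output dimension

# Training inputs as columns of X; QR gives orthonormal columns (X.T @ X = I).
X, _ = np.linalg.qr(rng.standard_normal((n, m)))
Y = rng.standard_normal((p, m))    # desired outputs, one column per pattern

# Outer-product (Hebbian-style) rule: W = sum_i y_i x_i^T = Y X^T.
W_op = Y @ X.T

# Optimal least-squares weights: W = Y X^+, with X^+ the Moore-Penrose pseudoinverse.
W_ls = Y @ np.linalg.pinv(X)

# For orthonormal inputs X^+ = X^T, so both rules give the same weights
# and both recall every stored pattern exactly.
print(np.allclose(W_op, W_ls))      # True
print(np.allclose(W_ls @ X, Y))     # True

# For correlated (non-orthogonal) inputs the outer-product rule is only a
# sub-optimal approximation: its residual is at least that of least squares.
X2 = rng.standard_normal((n, m))
W_op2 = Y @ X2.T
W_ls2 = Y @ np.linalg.pinv(X2)
print(np.linalg.norm(W_op2 @ X2 - Y) >= np.linalg.norm(W_ls2 @ X2 - Y))  # True
```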
single-layer neural networks
optimal training algorithm
least-squares optimality criterion
outer-product training rule
sub-optimal least-squares training algorithm
infinite series expansion of the network equations
capacity