On the performance of single-layered neural networks (Q1202336)

From MaRDI portal
Property / reviewed by: Kevin L. Moore
Property / MaRDI profile type: MaRDI publication profile
Property / cites work: Regression and the Moore-Penrose pseudoinverse
Property / cites work: A simple neural network generating an interactive memory
Property / cites work: Neural networks and physical systems with emergent collective computational abilities
Property / cites work: Correlation Matrix Memories
Property / cites work: The capacity of the Hopfield associative memory
Property / cites work: Q4200184
Property / cites work: Q5645536
Property / full work available at URL: https://doi.org/10.1007/bf00203135
Property / OpenAlex ID: W2009035571

Latest revision as of 10:02, 30 July 2024

Language: English
Label: On the performance of single-layered neural networks
Description: scientific article

    Statements

    On the performance of single-layered neural networks (English)
    23 February 1993
    The performance of single-layer neural networks is addressed. First, the concept of least-squares optimality is introduced and the optimal training algorithm for a least-squares optimality criterion is described. Next, the well-known outer-product training rule is analyzed: it is shown to be a sub-optimal least-squares training algorithm, achieving optimality exactly when the training data are orthogonal. This is demonstrated through the development of a useful infinite series expansion of the network equations. The capacity of the single-layer neural network is then examined. It is shown that using the optimal least-squares training algorithm improves the capacity of the network over that of the outer-product rule by a factor of \(\log n\), where \(n\) is the dimension of the input space. Finally, it is demonstrated that training networks with respect to their binary outputs increases the amount of information that can be reliably stored in the network during training.
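    The relationship between the two training rules described above can be sketched numerically. The following is an illustrative reconstruction, not the paper's own code: for a linear single-layer network with weight matrix \(W\) mapping input patterns (columns of \(X\)) to targets \(Y\), the outer-product rule sets \(W = YX^T\), while the least-squares-optimal weights are \(W = YX^+\) with \(X^+\) the Moore-Penrose pseudoinverse. The two coincide when the stored patterns are orthonormal, and only the least-squares rule retains exact recall once the patterns become correlated.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 16, 4                      # input dimension, number of stored patterns

# Orthonormal training patterns: columns of X_orth (via QR decomposition)
X_orth, _ = np.linalg.qr(rng.standard_normal((n, m)))
Y = rng.standard_normal((m, m))   # arbitrary target outputs

# Outer-product (Hebbian) rule vs. least-squares (pseudoinverse) rule
W_outer = Y @ X_orth.T
W_ls = Y @ np.linalg.pinv(X_orth)

# With orthonormal patterns, pinv(X) = X.T, so the two rules agree
# and recall is exact.
assert np.allclose(W_outer, W_ls)
assert np.allclose(W_ls @ X_orth, Y)

# With correlated (non-orthogonal) patterns, only the least-squares
# rule still reproduces the targets exactly (X has full column rank,
# so pinv(X) @ X is the identity).
X_corr = X_orth + 0.3 * rng.standard_normal((n, m))
W_outer = Y @ X_corr.T
W_ls = Y @ np.linalg.pinv(X_corr)

err_outer = np.linalg.norm(W_outer @ X_corr - Y)  # nonzero
err_ls = np.linalg.norm(W_ls @ X_corr - Y)        # near machine precision
print(f"outer-product recall error: {err_outer:.3f}")
print(f"least-squares recall error: {err_ls:.3e}")
```

    This mirrors the review's point that the outer-product rule is a sub-optimal least-squares algorithm whose optimality is recovered exactly in the orthogonal case; the specific dimensions and noise level here are arbitrary choices for illustration.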
    single-layer neural networks
    optimal training algorithm
    least-squares optimality criterion
    outer-product training rule
    sub-optimal least-squares training algorithm
    infinite series expansion of the network equations
    capacity

    Identifiers