Auto-association by multilayer perceptrons and singular value decomposition (Q1106762)

Language: English
Label: Auto-association by multilayer perceptrons and singular value decomposition
Description: scientific article

    Statements

    Title: Auto-association by multilayer perceptrons and singular value decomposition (English)
    Publication year: 1988
    Abstract: The multilayer perceptron, when working in auto-association mode, is sometimes considered an interesting candidate for performing data compression or dimensionality reduction of the feature space in information processing applications. The present paper shows that, for auto-association, the nonlinearities of the hidden units are useless and that the optimal parameter values can be derived directly by purely linear techniques relying on singular value decomposition and low rank matrix approximation, similar in spirit to the well-known Karhunen-Loève transform. This approach thus appears to be an efficient alternative to the general error back-propagation algorithm commonly used for training multilayer perceptrons. Moreover, it gives a clear interpretation of the role of the different parameters.
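
    The following is a minimal illustrative sketch, not taken from the paper itself, of the linear construction the abstract describes: assuming a single hidden layer of p linear units and mean-centred data, the encoder and decoder weights of the auto-associator can be read off directly from the truncated singular value decomposition, and the resulting reconstruction coincides with the best rank-p approximation of the data matrix. The synthetic data and all variable names below are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, p = 200, 10, 3

# Synthetic data with an approximately rank-3 structure plus a little noise
# (purely an assumption for this example).
X = rng.normal(size=(n_samples, p)) @ rng.normal(size=(p, n_features))
X += 0.01 * rng.normal(size=(n_samples, n_features))
X -= X.mean(axis=0)                        # remove the mean (role of the bias terms)

# Thin SVD of the data matrix: X = U @ diag(S) @ Vt.
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Encoder and decoder weights read off from the top-p right singular vectors,
# instead of being learned by error back-propagation.
W_enc = Vt[:p, :].T                        # (n_features, p): input -> hidden code
W_dec = Vt[:p, :]                          # (p, n_features): hidden code -> reconstruction

hidden = X @ W_enc                         # linear hidden-layer activations
X_hat = hidden @ W_dec                     # auto-associative reconstruction

# Best rank-p approximation of X (Eckart-Young), i.e. the Karhunen-Loeve solution.
X_rank_p = (U[:, :p] * S[:p]) @ Vt[:p, :]

print(np.allclose(X_hat, X_rank_p))        # True: the SVD weights attain the optimum
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))   # relative reconstruction error

    In this sketch the hidden layer is purely linear, consistent with the paper's observation that the hidden-unit nonlinearities are not needed for auto-association; back-propagation is replaced by a direct SVD computation.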
    Keywords: neural networks; multilayer perceptron; data compression; dimensionality reduction; information processing; auto-association; nonlinearities of the hidden units; singular value decomposition; low rank matrix approximation

    Identifiers

    Wikidata QID: Q28292671