EVD dualdating based online subspace learning (Q1718370)

From MaRDI portal

Language: English
Label: EVD dualdating based online subspace learning
Description: scientific article

    Statements

    EVD dualdating based online subspace learning (English)
    8 February 2019
    Summary: Conventional incremental PCA methods usually address only the case of adding samples. In this paper, we consider two further cases: deleting samples, and simultaneously adding and deleting samples. To avoid the NP-hard problem of downdating an SVD without the right singular vectors and specific position information, we use the EVD rather than the SVD employed by most IPCA methods. First, we propose an EVD updating and downdating algorithm, called EVD dualdating, which permits simultaneous arbitrary adding and deleting operations by transforming the EVD of the covariance matrix into an SVD updating problem plus the EVD of a small autocorrelation matrix. A comprehensive analysis describes the essence, extensibility, and computational complexity of EVD dualdating. A mathematical theorem proves that, if the whole data matrix satisfies the low-rank-plus-shift structure, EVD dualdating is an optimal rank-\(k\) estimator in the sequential environment. A selection method based on eigenvalues is presented to determine the optimal rank \(k\) of the subspace. We then propose three incremental/decremental PCA methods: EVDD-IPCA, EVDD-DPCA, and EVDD-IDPCA, which adapt to a varying mean. Finally, extensive comparative experiments demonstrate that the EVDD-based methods outperform conventional incremental/decremental PCA methods in both efficiency and accuracy.
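    The summary describes reducing the covariance-matrix update to a small eigenproblem when samples are added and deleted at once. The NumPy sketch below illustrates that general idea only; it is not the paper's EVD dualdating algorithm. The function name evd_dualdate_sketch, the zero-mean assumption, and the projection onto the joint span of the old basis and the added/deleted samples are illustrative choices, not details taken from the article.

    import numpy as np

    def evd_dualdate_sketch(U, s, A=None, D=None, k=None):
        # Hypothetical illustration of a simultaneous add/delete ("dualdating")
        # update of a rank-k eigenspace of the zero-mean data scatter matrix.
        # U: (d, k) orthonormal eigenbasis, s: (k,) eigenvalues, so the current
        # scatter is approximated by U @ diag(s) @ U.T.
        # A: (d, p) samples to add; D: (d, q) samples to delete.
        d = U.shape[0]
        k = len(s) if k is None else k
        A = np.empty((d, 0)) if A is None else A
        D = np.empty((d, 0)) if D is None else D

        # Orthonormal basis of the joint span of the old subspace and the
        # added/deleted samples; the update then reduces to a small EVD there.
        Q, _ = np.linalg.qr(np.hstack([U, A, D]))

        # Project the updated scatter S + A A^T - D D^T into that subspace.
        QU, QA, QD = Q.T @ U, Q.T @ A, Q.T @ D
        M = (QU * s) @ QU.T + QA @ QA.T - QD @ QD.T

        # Small symmetric eigenproblem; keep the k leading eigenpairs.
        lam, V = np.linalg.eigh(M)
        idx = np.argsort(lam)[::-1][:k]
        return Q @ V[:, idx], lam[idx]

    # Usage on synthetic zero-mean data: initialize from the first 150 samples,
    # then add 50 new samples while deleting the 50 oldest ones.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 200))
    U0, sv, _ = np.linalg.svd(X[:, :150], full_matrices=False)
    U, s = evd_dualdate_sketch(U0[:, :10], sv[:10] ** 2, A=X[:, 150:], D=X[:, :50])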

    Identifiers