Lower Rank Approximation of Matrices by Least Squares with Any Choice of Weights
Publication: 3924961
DOI: 10.2307/1268288 · zbMath: 0471.62004 · OpenAlex: W4255897207 · MaRDI QID: Q3924961
Authors: Shmuel Zamir, K. Ruben Gabriel
Publication date: 1979
Full work available at URL: https://doi.org/10.2307/1268288
Multivariate analysis (62H99) · Software, source code, etc. for problems pertaining to statistics (62-04) · Probabilistic methods, stochastic differential equations (65C99)
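The publication concerns the weighted lower-rank approximation problem: given a data matrix X and an arbitrary matrix of nonnegative weights W, find a rank-r factorization A B' that minimizes the weighted sum of squared residuals. The sketch below illustrates one standard way to attack that objective, alternating weighted least-squares fits over the rows of A and of B; the function name weighted_low_rank, the random initialization, and the fixed iteration count are illustrative assumptions, not the authors' published procedure.

import numpy as np

def weighted_low_rank(X, W, rank, n_iter=100, seed=0):
    """Sketch of a rank-`rank` approximation X ~ A @ B.T minimizing
    sum_ij W[i, j] * (X[i, j] - (A @ B.T)[i, j])**2
    by alternating weighted least-squares fits.  Illustrative only;
    names and defaults are assumptions, not the paper's algorithm verbatim.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    A = rng.standard_normal((n, rank))
    B = rng.standard_normal((m, rank))
    for _ in range(n_iter):
        # Fix B; refit each row of A by weighted least squares on B.
        for i in range(n):
            sw = np.sqrt(W[i])
            A[i] = np.linalg.lstsq(sw[:, None] * B, sw * X[i], rcond=None)[0]
        # Fix A; refit each row of B (i.e. each column of X) symmetrically.
        for j in range(m):
            sw = np.sqrt(W[:, j])
            B[j] = np.linalg.lstsq(sw[:, None] * A, sw * X[:, j], rcond=None)[0]
    return A, B

# Example: a zero weight lets a cell be ignored entirely (e.g. a missing value).
X = np.arange(12.0).reshape(3, 4)
W = np.ones_like(X)
W[0, 1] = 0.0
A, B = weighted_low_rank(X, W, rank=2)

With equal weights this reduces to the ordinary least-squares (Eckart-Young) lower-rank approximation; the point of arbitrary weights is to handle missing cells and unequal error variances within the same framework.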
Related Items
Graph partitioning by correspondence analysis and taxicab correspondence analysis ⋮ Robust low-rank data matrix approximations ⋮ Resistant lower rank approximation of matrices by iterative majorization ⋮ Weighted least squares fitting using ordinary least squares algorithms ⋮ A unifying theory of tests of rank ⋮ Reduced-rank models for interaction in unequally replicated two-way classifications ⋮ Weighted and robust learning of subspace representations ⋮ Left vs. right representations for solving weighted low-rank approximation problems ⋮ Robust regularized singular value decomposition with application to mortality data ⋮ A robust biplot ⋮ Logistic biplot for nominal data ⋮ A gradual rank increasing process for matrix completion ⋮ Weighted norms in subspace-based methods for time series analysis ⋮ Multiple hypothesis testing adjusted for latent variables, with an application to the AGEMAP gene expression data ⋮ Selecting the number of components in principal component analysis using cross-validation approximations ⋮ Stability of principal component analysis studied by the bootstrap method ⋮ A majorization algorithm for simultaneous parameter estimation in robust exploratory factor analysis ⋮ Statistical inference based on robust low-rank data matrix approximation ⋮ Sparse principal component analysis via regularized low rank matrix approximation ⋮ Fast computation of robust subspace estimators ⋮ GBMs: GLMs with bilinear terms ⋮ Commemoration of the bicentennial of the publication (1805--1806) of the least squares method by Adrien Marie Legendre ⋮ Multiple taxicab correspondence analysis ⋮ Another look at Bayesian analysis of AMMI models for genotype-environment data ⋮ An algorithm for weighted bilinear regression ⋮ L1-norm projection pursuit principal component analysis ⋮ Comparisons among several methods for handling missing data in principal component analysis (PCA) ⋮ Block tensor train decomposition for missing data estimation ⋮ Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals ⋮ Multiple imputation in principal component analysis ⋮ Nearest neighbours in least-squares data imputation algorithms with different missing patterns ⋮ Nearest neighbour approach in the least-squares data imputation algorithms ⋮ A multivariate reduced-rank growth curve model with unbalanced data ⋮ Generalized structured component analysis ⋮ Forecasting time series of inhomogeneous Poisson processes with application to call center workforce management ⋮ Low-rank matrix approximation in the infinity norm ⋮ The incompleteness problem of the APT model ⋮ Principal component analysis with external information on both subjects and variables ⋮ Weighted estimation of AMMI and GGE models ⋮ Intraday forecasts of a volatility index: functional time series methods with dynamic updating ⋮ Practical matrix completion and corruption recovery using proximal alternating robust subspace minimization ⋮ \(M\)-type smoothing spline estimators for principal functions ⋮ Robust centroid method ⋮ Structured total least squares and \(L_2\) approximation problems ⋮ On the Complexity of Robust PCA and ℓ1-Norm Low-Rank Matrix Approximation ⋮ A weighted POD method for elliptic PDEs with random inputs ⋮ Nearest Neighbour in Least Squares Data Imputation Algorithms for Marketing Data ⋮ Some new aspects of taxicab correspondence analysis ⋮ A Tale of Two Matrix Factorizations ⋮ Handling missing values with regularized iterative multiple correspondence analysis