A generalization of the Eckart-Young-Mirsky matrix approximation theorem (Q1091453)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | A generalization of the Eckart-Young-Mirsky matrix approximation theorem | scientific article |
Statements
A generalization of the Eckart-Young-Mirsky matrix approximation theorem (English)
1987
Let \(X\) be an \(n\times p\) matrix with \(n\geq p\) and let \(\|\cdot\|\) be a unitarily invariant matrix norm. Let \(X=(X_1,X_2)\), where \(X_1\) has \(k\) columns. The problem considered in this paper is: find a matrix \(\hat X_2\) such that \(\operatorname{rank}(X_1,\hat X_2)\leq r\) and \[ \|(X_1,\hat X_2)-(X_1,X_2)\| = \inf_{\operatorname{rank}(X_1,\bar X_2)\leq r}\|(X_1,\bar X_2)-(X_1,X_2)\|. \] This problem was solved by \textit{C. Eckart} and \textit{G. Young} [The approximation of one matrix by another of lower rank, Psychometrika 1, 211-218 (1936)] in the case \(k=0\) for the Frobenius norm. Let \(H_r(X)\) denote the Eckart-Young solution (with \(H_r(X)=X\) if \(r>p\)). The authors prove the following: Theorem. Let \(X=(X_1,X_2)\), where \(X_1\) has \(k\) columns, and let \(\ell=\operatorname{rank} X_1\). Let \(P\) denote the orthogonal projection onto the column space of \(X_1\) and \(P^{\perp}\) the orthogonal projection onto its orthogonal complement. If \(\ell\leq r\), then the matrix \(\hat X_2=PX_2+H_{r-\ell}(P^{\perp}X_2)\) is a solution of the problem above. A number of consequences of this theorem are considered; in particular, applications to multiple correlations, variance inflation factors and total least squares are given.
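The construction in the theorem is straightforward to carry out numerically. The following is a minimal NumPy sketch, not part of the original review: the helper names `best_rank_r` and `constrained_approx` are illustrative, `best_rank_r` realizes the Eckart-Young solution \(H_r\) by truncating the singular value decomposition, and \(\ell=\operatorname{rank} X_1\) is estimated with a simple singular-value threshold.

```python
import numpy as np

def best_rank_r(A, r):
    """Eckart-Young solution H_r(A): best rank-r approximation via a truncated SVD."""
    if r >= min(A.shape):
        return A.copy()
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def constrained_approx(X1, X2, r, tol=1e-12):
    """Construction from the theorem: X2_hat = P X2 + H_{r-ell}(P_perp X2),
    where P projects onto the column space of X1 and ell = rank X1."""
    U1, s1, _ = np.linalg.svd(X1, full_matrices=False)
    ell = int(np.sum(s1 > tol))        # ell = rank(X1), estimated by thresholding
    Q = U1[:, :ell]                    # orthonormal basis for the column space of X1
    PX2 = Q @ (Q.T @ X2)               # P X2
    return PX2 + best_rank_r(X2 - PX2, r - ell)   # P X2 + H_{r-ell}(P_perp X2)

# Small illustration with random data (requires ell <= r, as in the theorem).
rng = np.random.default_rng(0)
X1 = rng.standard_normal((10, 3))
X2 = rng.standard_normal((10, 5))
r = 4
X2_hat = constrained_approx(X1, X2, r)
print(np.linalg.matrix_rank(np.hstack([X1, X2_hat])))  # should be <= r
print(np.linalg.norm(X2_hat - X2, "fro"))              # approximation error
```

The two prints check that the rank constraint \(\operatorname{rank}(X_1,\hat X_2)\leq r\) holds and report the Frobenius-norm error of the constrained approximation.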
Keywords
matrix approximation
unitarily invariant matrix norm
rank
singular values
Frobenius norm
multiple correlations
variance inflation factors
total least squares