The prejudices of least squares, principal components and common factors schemes (Q1116594)
From MaRDI portal
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | The prejudices of least squares, principal components and common factors schemes | scientific article | |
Statements
The prejudices of least squares, principal components and common factors schemes (English)
1989
We prove mathematically that the least squares regression scheme is of little use for identification from inexact data, even in the low-noise case. Its results depend solely on the prejudices concerning which subsets of variables are chosen as ``regressands'' and which as ``regressors'' from a given set of data variables. In practice, least squares regression results are always biased and depend on the relative noise levels, and even the signs of the ``estimates'' are completely determined by the arbitrary choice of regressands if the number of underlying relationships is misspecified. The principal components (or statistical common factor) scheme meets a similar fate, since the choice of how many principal components (or common factors) to retain is essentially prejudiced and not determined by the data. Both schemes produce artificial and unexplained correlations among the residuals; for the principal components scheme this occurs because it usually violates Wilson's inequality. A fortiori, we prove exactly why factor indeterminacy occurs in practice. The exact (ideal) multiple common factor scheme of \textit{R. Frisch} [Statistical confluence analysis by means of complete regression systems (1934; Zbl 0011.21903)] and \textit{L. L. Thurstone} [The vectors of mind, Chicago (1935)] has not yet been solved, except for small numbers (six or fewer) of data variables.
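A minimal numerical sketch of the regressand/regressor prejudice described above (not taken from the paper; the noise levels, slope, and variable names are illustrative assumptions): when both observed variables carry noise around an exact latent relation, regressing y on x and regressing x on y imply different slopes, and neither recovers the true one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
true_slope = 2.0  # exact latent relation: y* = 2 x*

# Latent exact variables, each observed with independent measurement noise.
x_star = rng.normal(size=n)
x = x_star + rng.normal(scale=0.5, size=n)                 # noisy observation of x*
y = true_slope * x_star + rng.normal(scale=0.5, size=n)    # noisy observation of y*

cov_xy = np.cov(x, y)[0, 1]

# Prejudice 1: take y as regressand -> slope attenuated toward zero.
b_y_on_x = cov_xy / np.var(x)

# Prejudice 2: take x as regressand, then invert -> slope inflated away from zero.
b_x_on_y_inverted = np.var(y) / cov_xy

print(b_y_on_x, b_x_on_y_inverted)
# The two "estimates" bracket the true slope; which one you get
# depends only on the arbitrary choice of regressand.
```

With these (assumed) noise variances the y-on-x slope converges to 2/1.25 = 1.6 and the inverted x-on-y slope to 4.25/2 = 2.125, so the gap between the two answers is driven entirely by the relative noise levels, as the abstract states.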
least squares regression
principal components
Wilson's inequality
multiple common factor scheme