On Best Linear Estimation and General Gauss-Markov Theorem in Linear Models with Arbitrary Nonnegative Covariance Structure
From MaRDI portal
Publication: 5588985
DOI: 10.1137/0117110 · zbMATH Open: 0193.47301 · OpenAlex: W2088083065 · MaRDI QID: Q5588985
Authors: George Zyskind, F. B. Martin
Publication date: 1969
Published in: SIAM Journal on Applied Mathematics
Full work available at URL: https://doi.org/10.1137/0117110
Cited In (37)
- Perfect linear models and perfect parametric functions
- Bounds for the trace of the difference of the covariance matrices of the OLSE and BLUE
- Estimation of regression vectors in linear mixed models with Dirichlet process random effects
- On connections among OLSEs and BLUEs of whole and partial parameters under a general linear model
- Projectors and linear estimation in general linear models
- Some further remarks on the singular linear model
- The general Gauss-Markov model with possibly singular dispersion matrix
- Some further remarks on the linear sufficiency in the linear model
- A complete sufficient statistic for the linear model under normality and a singular covariance matrix
- Two matrix-based proofs that the linear estimator \(G y\) is the best linear unbiased estimator
- Linear prediction sufficiency in the misspecified linear model
- A new approach to the concept of a strong unified-least-squares matrix
- On the convergence of an iterative method for the computation of generalized inverse and associated projections
- All about the \(\bot\) with its applications in the linear statistical models
- Estimation of missing values in the general Gauss-Markoff model
- Characterization of the multivariate Gauss-Markoff model with singular covariance matrix and missing values
- Determinations of B.L.U.E.s of EY in Model I (EY = XΘ, Cov Y = Q) according to the relative positions of the subspaces Im X, Im Q and Ker Q, for singular Q
- A Useful Matrix Decomposition and Its Statistical Applications in Linear Regression
- Another look at the naive estimator in a regression model
- Estimation From Transformed Data Under the Linear Regression Model
- The applicability of ordinary least squares to consistently short distances between taxa in phylogenetic tree construction and the normal distribution test consequences
- Linear models that allow perfect estimation
- Three rank formulas associated with the covariance matrices of the BLUE and the OLSE in the general linear model
- Projections under seminorms and generalized Moore-Penrose inverses
- Additional variables and adjusted estimates with arbitrary known variance-covariance structure
- Gauss-Markov and weighted least-squares estimation under a general growth curve model
- Multi-output multilevel best linear unbiased estimators via semidefinite programming
- Löwner-ordering antitonicity of generalized inverses of Hermitian matrices
- On the equivalence of the weighted least squares and the generalised least squares estimators, with applications to kernel smoothing
- Some further results related to reduced singular linear models
- A property of partitioned generalized regression
- Further remarks on permissible covariance structures for simultaneous retention of BLUEs in linear models
- Projector operators in the multivariate Zyskind-Martin model
- Estimating large-scale general linear and seemingly unrelated regressions models after deleting observations
- The general linear model of the generalized singular value decomposition
- The efficiency comparisons between OLSE and BLUE in a singular linear model
- Nonnegative-definite covariance structures for which the BLU, WLS, and LS estimators are equal