
On a Unified Theory of Estimation in Linear Models—A Review of Recent Results

Published online by Cambridge University Press:  05 September 2017

Abstract

The paper deals with two approaches to the estimation of the parameters β and σ² in the General Gauss-Markoff (GGM) model represented by the triplet (Y, Xβ, σ²V), where E(Y) = Xβ and D(Y) = σ²V, when no assumptions are made about the ranks of X and V. One is the Inverse Partition Matrix (IPM) method, which depends on the numerical evaluation of a g-inverse of a partitioned matrix. The second, called the Unified Least Squares (ULS) method, is an analogue of least squares theory that remains applicable when V is singular, unlike Aitken's method, which applies only to non-singular V.
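The sketch below is only an illustration of the two recipes summarised in the abstract, not an implementation taken from the paper: it assumes NumPy, takes the Moore-Penrose pseudoinverse as one admissible g-inverse, chooses T = V + XX' as one admissible matrix in the ULS step, and runs on made-up toy data with a rank-deficient X and a singular V. The function names ipm_estimates and uls_estimates are introduced here for illustration only.

```python
# A minimal numerical sketch of the IPM and ULS ideas (illustrative only).
import numpy as np

def ipm_estimates(Y, X, V):
    """IPM sketch: form M = [[V, X], [X', 0]], take a g-inverse partitioned as
    [[C1, C2], [C3, -C4]]; then beta_hat = C3 @ Y and
    sigma2_hat = Y' C1 Y / f with f = rank([V X]) - rank(X)."""
    n, m = X.shape
    M = np.block([[V, X], [X.T, np.zeros((m, m))]])
    G = np.linalg.pinv(M)                      # one admissible g-inverse of M
    C1, C3 = G[:n, :n], G[n:, :n]
    f = np.linalg.matrix_rank(np.hstack([V, X])) - np.linalg.matrix_rank(X)
    beta_hat = C3 @ Y
    sigma2_hat = float(Y @ C1 @ Y) / f
    return beta_hat, sigma2_hat

def uls_estimates(Y, X, V):
    """ULS sketch: with T = V + X X' (one admissible choice), solve the
    "normal equations" X' T^- X beta = X' T^- Y and put
    sigma2_hat = (Y - X beta_hat)' T^- (Y - X beta_hat) / f."""
    T = V + X @ X.T
    Tg = np.linalg.pinv(T)                     # g-inverse of T
    beta_hat = np.linalg.pinv(X.T @ Tg @ X) @ (X.T @ Tg @ Y)
    resid = Y - X @ beta_hat
    f = np.linalg.matrix_rank(np.hstack([V, X])) - np.linalg.matrix_rank(X)
    sigma2_hat = float(resid @ Tg @ resid) / f
    return beta_hat, sigma2_hat

# Toy data: a rank-deficient X and a singular V, the situation the GGM model allows.
X = np.array([[1., 1., 0.], [1., 1., 0.], [1., 0., 1.], [1., 0., 1.]])
V = np.diag([1., 1., 1., 0.])
Y = np.array([1.2, 0.8, 2.1, 2.1])
b_ipm, s2_ipm = ipm_estimates(Y, X, V)
b_uls, s2_uls = uls_estimates(Y, X, V)
# The fitted values X @ b_ipm and X @ b_uls (the BLUE of X beta) and the two
# sigma^2 estimates should agree up to rounding; the beta vectors themselves
# may differ because X is not of full rank, so beta is not identifiable.
```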

Type
Part III — Statistical Theory
Copyright
Copyright © 1975 Applied Probability Trust 


References

[1] Aitken, A. C. (1934) On least squares and linear combination of observations. Proc. Roy. Soc. Edinburgh A55, 42-47.
[2] Anderson, T. W. (1972) Efficient estimation of regression coefficients in time series. Proc. Sixth Berkeley Symp. Math. Statist. Prob. 1, 471-482.
[3] Cleveland, W. S. (1970) Projection with wrong inner product and its application to regression with correlated errors and linear filtering of time series. Ann. Math. Statist. 42, 616-624.
[4] Goldman, A. J. and Zelen, M. (1964) Weak generalized inverses and minimum variance linear unbiased estimation. J. Research Nat. Bureau of Standards 68B, 151-172.
[5] Kruskal, W. (1968) When are Gauss-Markoff and least squares estimators identical? A coordinate free approach. Ann. Math. Statist. 39, 70-75.
[6] Mitra, S. K. and Rao, C. R. (1968) Some results in estimation and tests of linear hypotheses under the Gauss-Markoff model. Sankhya A 30, 281-290.
[7] Mitra, S. K. and Rao, C. R. (1969) Conditions for optimality and validity of least squares theory. Ann. Math. Statist. 40, 1617-1624.
[8] Mitra, S. K. and Rao, C. R. (1973) Projections under semi-norms and generalized inverse of matrices. Tech. Report, Indiana University, Bloomington.
[9] Mitra, S. K. and Moore, J. B. (1973) Gauss-Markoff estimation with an incorrect dispersion matrix. Sankhya A 35, 139-152.
[10] Rao, C. R. (1967) Least squares theory using an estimated dispersion matrix and its application to measurement of signals. Proc. Fifth Berkeley Symp. Math. Statist. Prob. 1, 355-372.
[11] Rao, C. R. (1968) A note on a previous lemma in the theory of least squares and some further results. Sankhya A 30, 259-266.
[12] Rao, C. R. (1971) Unified theory of linear estimation. Sankhya A 33, 371-394.
[13] Rao, C. R. (1972a) A note on the IPM method in the unified theory of linear estimation. Sankhya A 34, 285-288.
[14] Rao, C. R. (1972b) Some recent results in linear estimation. Sankhya B 34, 369-378.
[15] Rao, C. R. (1973a) Unified theory of least squares. Communications in Statistics 1, 1-8.
[16] Rao, C. R. (1973b) Linear Statistical Inference and its Applications. Second Edition. Wiley, New York.
[17] Rao, C. R. (1973c) Representations of best linear unbiased estimators in the Gauss-Markoff model with a singular dispersion matrix. J. Multivariate Anal. 3, 276-292.
[18] Rao, C. R. (1973d) On a unified theory of estimation in linear models. Mimeograph series 319, Department of Statistics, Purdue University, U.S.A.
[19] Rao, C. R. (1973e) Theory of estimation in the general Gauss-Markoff model. Paper presented at the International Symposium on Statistical Design and Linear Models, Fort Collins.
[20] Rao, C. R. (1974) Projectors, generalized inverses and the BLUE's. To appear.
[21] Rao, C. R. and Mitra, S. K. (1971a) Generalized Inverse of Matrices and its Applications. Wiley, New York.
[22] Rao, C. R. and Mitra, S. K. (1971b) Further contributions to the theory of generalized inverse of matrices and its applications. Sankhya A 33, 289-300.
[23] Seely, J. and Zyskind, G. (1971) Linear spaces and minimum variance unbiased estimation. Ann. Math. Statist. 42, 691-703.
[24] Styan, G. P. H. Personal communication (cited in reference [9]).
[25] Watson, G. S. (1967) Linear least squares regression. Ann. Math. Statist. 38, 1679-1699.
[26] Zyskind, G. (1967) On canonical forms, non-negative covariance matrices and best and simple least squares linear estimators in linear models. Ann. Math. Statist. 38, 1092-1110.
[27] Zyskind, G. and Martin, F. B. (1969) On best linear estimation and a general Gauss-Markoff theorem in linear models with arbitrary non-negative covariance structure. SIAM J. Appl. Math. 17, 1190-1202.
[28] Björck, Å. (1974) A uniform numerical method for linear estimation from general Gauss-Markoff model. To appear.
[29] Mitra, S. K. (1973) Unified least squares approach to linear estimation in a general Gauss-Markoff model. SIAM J. Appl. Math. 25, 671-680.