Abstract

Many papers have addressed the problem of fitting a straight line to a set of bivariate data in which both the dependent variable Y and the predictor variable X are assumed to be measured subject to error, a problem that has generally become known as errors-in-variables regression. Several statistical models have been suggested to describe the problem, and many papers discuss the choice of model, generally going on to derive estimating equations for the parameters of the chosen model. Rather less work has been published on the precision of the resulting estimators, and what has been published to date applies only in certain special cases. In the present paper the complete asymptotic variance–covariance matrices of the estimators are derived for the linear structural model under various assumptions about the error variances involved. The resulting equations can readily be used in practice, though the algebra needed to derive them is somewhat lengthy. With the aid of a simulation exercise we show that these asymptotic variance and covariance terms are remarkably accurate even for relatively small data sets. To establish a consistent notation, we present the estimating equations within a rigorous and unified framework underpinned by the principle of maximum likelihood. As a useful check on the algebraic integrity of our results, attention is focused on the special cases of the model for which solutions have been published previously.
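To illustrate the kind of estimator the abstract refers to, the sketch below implements one well-known special case of the linear structural model: the maximum-likelihood slope when the ratio λ of the two error variances is assumed known (often called Deming regression). This is an illustrative sketch only; the function name and the simulated data are our own, not the paper's, and the paper's contribution (the full asymptotic variance–covariance matrices) is not reproduced here.

```python
import numpy as np

def structural_ml_slope(x, y, lam=1.0):
    """ML slope for the linear structural model when
    lam = var(error in Y) / var(error in X) is assumed known
    (the classical Deming-regression special case)."""
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    d = syy - lam * sxx
    # Positive root of the ML estimating equation for the slope
    return (d + np.sqrt(d * d + 4.0 * lam * sxy * sxy)) / (2.0 * sxy)

# Simulated check: true line Y = 1 + 2*xi, equal error variances (lam = 1)
rng = np.random.default_rng(0)
n = 5000
xi = rng.normal(0.0, 2.0, n)                    # latent predictor values
x = xi + rng.normal(0.0, 0.5, n)                # X observed with error
y = 1.0 + 2.0 * xi + rng.normal(0.0, 0.5, n)    # Y observed with error

beta_ml = structural_ml_slope(x, y, lam=1.0)
# Naive OLS ignores the error in X and is attenuated toward zero
beta_ols = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
```

In this simulation the ML estimate recovers a slope close to the true value of 2, whereas ordinary least squares is biased downward because it ignores the measurement error in X.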

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://dbpia.nl.go.kr/journals/pages/open_access/funder_policies/chorus/standard_publication_model)