# zbMATH — the first resource for mathematics

Linear model with variances depending on the mean value. (English) Zbl 0764.62055
Let $$(Y,X\beta,\Sigma)$$ be a linear regression model. The observations are a realization of a random vector $$Y_{n,1}$$ with mean value $$E_\beta Y=X\beta$$, where $$X_{n,k}$$ is a known design matrix and $$\beta_{k,1}\in R^k$$ is the vector of unknown parameters. The covariance matrix of the vector $$Y$$ depends on $$\beta$$: $\Sigma=\sigma^2\Sigma(\beta)=\text{diag}(\sigma^2(a+b|e_i'X\beta|^2))_{1\leq i\leq n},$ where $$\sigma^2$$, $$a$$ and $$b$$ are known positive constants, and $$e_i'$$ is the transpose of the $$i$$th unit vector.
The $$\beta_0$$-locally best linear unbiased estimator of a linear function of the parameter $$\beta$$ is obtained.
Reviewer: N.Leonenko (Kiev)
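The estimator described above can be sketched numerically: for a fixed locality point $$\beta_0$$, the covariance matrix $$\Sigma(\beta_0)$$ is diagonal, and the locally best linear unbiased estimator takes the familiar generalized-least-squares form $$(X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}Y$$. The following NumPy sketch illustrates this under those assumptions; the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def lblue(X, Y, beta0, sigma2=1.0, a=1.0, b=1.0):
    """beta_0-locally best linear unbiased estimator (illustrative sketch).

    Uses Sigma(beta_0) = diag(sigma^2 * (a + b * |e_i' X beta_0|^2)),
    i.e. generalized least squares with the covariance evaluated at beta0.
    """
    mu = X @ beta0                              # mean vector at the locality point
    w = sigma2 * (a + b * np.abs(mu) ** 2)      # diagonal entries of Sigma(beta0)
    Xw = X / w[:, None]                         # Sigma^{-1} X (Sigma is diagonal)
    # Solve (X' Sigma^{-1} X) beta_hat = X' Sigma^{-1} Y
    return np.linalg.solve(X.T @ Xw, Xw.T @ Y)

# Noise-free example: with Y = X beta exactly, the estimator recovers beta
# for any choice of the locality point beta0.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta
print(lblue(X, Y, beta0=np.zeros(3)))
```

When $$b=0$$ the weights are constant, so the estimator reduces to ordinary least squares; for $$b>0$$ it down-weights observations whose mean $$e_i'X\beta_0$$ is large in absolute value, as the model's variance structure dictates.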

##### MSC:
- 62J05 Linear regression; mixed models
- 62H12 Estimation in multivariate analysis
##### References:
- [1] KUBÁČEK, L.: Foundations of Estimation Theory. Elsevier, Amsterdam-Oxford-New York-Tokyo, 1988. · Zbl 0698.62004
- [2] RAO, C. R., MITRA, S. K.: Generalized Inverse of Matrices and its Applications. J. Wiley, New York, 1971. · Zbl 0236.15005