Asymptotic distributions of least squares estimates of coefficients of linear regressions with nonlinear constraints and strong dependence. (Ukrainian, English) Zbl 1164.62305

Teor. Jmovirn. Mat. Stat. 75, 105-120 (2006); translation in Theory Probab. Math. Stat. 75, 121-137 (2007).
This paper deals with the estimation of the unknown parameter \(\beta=(\beta_1,\dots,\beta_{n})'\) in a linear regression model with continuous time \(y(t)=\beta'g(t)+\eta(t)\), \(0\leq t\leq T\), subject to nonlinear constraints \(h_{j}(\beta)\leq0\), \(j=1,\dots,r\), where \(g(t)=(g_1(t),\dots,g_{n}(t))'\) is a vector of known functions and \(\eta(t)\), \(t\in R\), is a measurable, mean-square continuous, second-order stationary process with zero mean. The case is considered where \(\eta(t)=G(\varepsilon(t))\), \(t\in R\), with \(G\) an arbitrary real-valued, measurable, non-random function and \(\varepsilon(t)\), \(t\in R\), a Gaussian stochastic process with zero mean and covariance \(B_{\varepsilon}(t)=(1+t^{2})^{-\alpha/2}\), \(0<\alpha<1\).
The author investigates the solution of the minimization problem for the least squares functional of this regression model. It is proved that the solution, suitably centred and normalized, converges in distribution to the solution of a quadratic programming problem, and that this limit distribution is non-Gaussian.
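The constrained least squares estimator discussed above can be illustrated numerically. The following sketch (not the paper's method; the design functions, the quadratic constraint \(h(\beta)=\beta_1^2+\beta_2^2-4\), and the i.i.d. noise standing in for \(G(\varepsilon(t))\) are all illustrative assumptions) discretizes the least squares functional and minimizes it under an inequality constraint:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative setup: y(t) = beta'g(t) + eta(t) on a grid over [0, T].
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
g = np.vstack([np.ones_like(t), t])        # assumed design g(t) = (1, t)'
beta_true = np.array([1.0, -0.5])
eta = rng.standard_normal(t.size)          # i.i.d. stand-in for G(eps(t))
y = beta_true @ g + eta

# Discretized least squares functional Q_T(beta) ≈ ∫ (y(t) - beta'g(t))^2 dt
def Q(beta):
    return np.sum((y - beta @ g) ** 2) * (t[1] - t[0])

# Hypothetical nonlinear constraint h(beta) = beta_1^2 + beta_2^2 - 4 <= 0,
# expressed in SciPy's "ineq" convention (fun >= 0 means feasible).
cons = [{"type": "ineq", "fun": lambda b: 4.0 - b[0] ** 2 - b[1] ** 2}]
res = minimize(Q, x0=np.zeros(2), constraints=cons)
print(res.x)   # constrained least squares estimate of beta
```

Here the constraint is inactive at the truth, so the estimate stays close to \(\beta\); under strong dependence in \(\eta(t)\), as in the paper, the limit law of the normalized estimator would be non-Gaussian rather than the familiar normal limit.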

MSC:

62E20 Asymptotic distribution theory in statistics
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62F30 Parametric inference under constraints
62F10 Point estimation
90C20 Quadratic programming