## Consistency of kernel estimators of heteroscedastic and autocorrelated covariance matrices
*(English)*
Zbl 1016.62030

From the introduction: This paper derives conditions for the consistency of kernel estimators of the covariance matrix of a weighted sum of vectors of dependent heterogeneous random variables. A leading example of its application is where an estimator \(\widehat\theta_n\) \((r\times 1)\) of a parameter \(\theta_0\) is known to satisfy
\[
n^{1/2} (\widehat\theta_n - \theta_0) - B_n^{-1} \sum^n_{t=1} X_{nt}(\theta_0) \xrightarrow{p} 0, \tag{1}
\]
where the random function \(X_{nt} (\theta)\) \((p\times 1)\) has mean zero at \(\theta_0\), and \(B_n\) \((r\times p)\) is some nonrandom matrix that is usually easily estimated. Applying a central limit theorem (CLT) to the second term in (1) leads to
\[
(B_n \Omega_n B_n')^{-1/2}\, n^{1/2} (\widehat\theta_n - \theta_0) \xrightarrow{d} N(0, I_r), \tag{2}
\]
where, letting \(X_{nt}= X_{nt}(\theta_0)\),
\[
\Omega_n= \sum^n_{t=1} \sum^n_{s=1} EX_{nt}X_{ns}'. \tag{3}
\]
A complete asymptotic distribution theory for \(\widehat \theta_n\) must incorporate whatever conditions are needed to ensure consistent estimation of \(\Omega_n\) when the array \(X_{nt}\) is dependently and heterogeneously distributed. An undesirable feature of some other studies is that they impose conditions stronger in important respects than are known to be required for the application of a CLT to the same variables.

In this paper, we bridge the gap between asymptotic normality and consistent covariance matrix estimation. We prove our results for stochastic (sample-dependent) bandwidths for kernel estimators, and show that in the standard cases a sufficient condition on the bandwidth for consistency of the variance estimator is that the ratio of the bandwidth to the sample size converges to zero. Our central result establishes convergence to zero of the difference between the elements of the estimated and the true covariance matrix; there is no need to assume that the true covariance matrix itself converges to a well-defined limit. We argue that relaxing the so-called size conditions on the sequences measuring dependence is not possible in the case of covariance matrix estimation for root-\(n\) consistent minimization estimators, so that in this sense our dependence conditions are the best possible. Finally, we generalize from the root-\(n\) consistency represented in (2) by giving results subject to the condition
\[
n^{1/2} \kappa_n (\widehat\theta_n - \theta_0) = O_p(1), \tag{4}
\]
where \(\kappa_n\) is a diagonal matrix function of \(n\).
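To make the object under discussion concrete, the following is a minimal sketch of a kernel estimator of \(\Omega_n\) of the type analyzed here, using the Bartlett kernel and a deterministic bandwidth satisfying the bandwidth-to-sample-size condition above. The function name, the kernel choice, and the simulated AR(1) example are illustrative assumptions, not the authors' exact estimator or conditions.

```python
import numpy as np

def hac_estimate(X, bandwidth):
    """Bartlett-kernel estimate of Omega_n = sum_t sum_s E[X_t X_s'].

    X is an (n, p) array whose rows play the role of the mean-zero
    scores X_{nt}; `bandwidth` is the lag-truncation parameter
    (consistency in the standard case needs bandwidth / n -> 0).
    Illustrative sketch only, not the paper's estimator.
    """
    n, p = X.shape
    omega = X.T @ X  # lag-0 term
    for j in range(1, int(bandwidth) + 1):
        w = 1.0 - j / (bandwidth + 1.0)   # Bartlett kernel weight
        gamma_j = X[j:].T @ X[:-j]        # sum over t of X_t X_{t-j}'
        omega += w * (gamma_j + gamma_j.T)
    return omega

# Example: scores with AR(1)-type dependence; bandwidth grows
# like n^(1/3), so bandwidth / n -> 0 as required.
rng = np.random.default_rng(0)
n = 2000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t]
omega_hat = hac_estimate(x[:, None], bandwidth=n ** (1 / 3))
```

With positive serial correlation, `omega_hat / n` estimates the long-run variance of the scores, and the Bartlett weights keep the estimate positive semi-definite.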