## On the estimation of quadratic functionals

*(English)*
Zbl 0729.62076

Summary: We discuss the difficulties of estimating quadratic functionals based on observations Y(t) from the white noise model
\[
Y(t)=\int^{t}_{0}f(u)du+\sigma W(t),\quad t\in [0,1],
\]
where \(W(t)\) is a standard Wiener process on \([0,1]\). The optimal rates of convergence (as \(\sigma \to 0\)) for estimating quadratic functionals under certain geometric constraints are found.
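The observation model above can be simulated directly on a grid; the following sketch discretizes \(Y(t)=\int_0^t f(u)\,du+\sigma W(t)\) with a Riemann sum for the drift and independent Gaussian increments for the Wiener process. The test function `np.sin`, the grid size `n`, and the noise level are illustrative choices, not part of the original abstract.

```python
import numpy as np

def simulate_white_noise_model(f, sigma, n=1000, rng=None):
    """Return a grid t on [0, 1] and one sample path of
    Y(t) = int_0^t f(u) du + sigma * W(t)."""
    rng = np.random.default_rng(rng)
    t = np.linspace(0.0, 1.0, n + 1)
    dt = 1.0 / n
    # Cumulative integral of f via a left-endpoint Riemann sum.
    drift = np.concatenate(([0.0], np.cumsum(f(t[:-1]) * dt)))
    # Standard Wiener process: W(0) = 0, independent N(0, dt) increments.
    W = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))))
    return t, drift + sigma * W

# Example: one path with f = sin and a small noise level.
t, Y = simulate_white_noise_model(np.sin, sigma=0.05, n=1000, rng=0)
```

As \(\sigma \to 0\) the path concentrates on the drift \(\int_0^t f\), which is the asymptotic regime in which the rates below are stated.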

Specifically, the optimal rates of estimating \(\int_0^1 [f^{(k)}(x)]^2\,dx\) under hyperrectangular constraints \(\Sigma = \{f : |x_j(f)| \leq C j^{-\alpha}\}\) and weighted \(l_p\)-body constraints \(\Sigma_p = \{f : \sum_{j=1}^{\infty} j^r |x_j(f)|^p \leq C\}\) are computed explicitly, where \(x_j(f)\) is the \(j\)th Fourier-Bessel coefficient of the unknown function \(f\). We develop lower bounds based on testing two highly composite hypercubes and address their advantages. The attainable lower bounds are found by applying the hardest one-dimensional approach as well as the hypercube method.
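In the equivalent sequence-space formulation one observes the Fourier coefficients with noise, \(y_j = x_j(f) + \sigma z_j\) with \(z_j\) i.i.d. \(N(0,1)\), and \(\int_0^1 f^2\) (the \(k=0\) case) becomes \(\sum_j x_j^2\). The sketch below uses a simple bias-corrected truncated sum \(\sum_{j \le N}(y_j^2 - \sigma^2)\); the truncation point `N` and the coefficient sequence are illustrative choices, not the paper's optimal construction.

```python
import numpy as np

def estimate_quadratic(y, sigma, N):
    """Bias-corrected truncated estimator of sum_j x_j^2:
    each term y_j^2 - sigma^2 has expectation x_j^2."""
    return np.sum(y[:N] ** 2 - sigma ** 2)

rng = np.random.default_rng(0)
sigma = 0.01
# Coefficients inside a hyperrectangle |x_j| <= C j^{-alpha}, with
# C = 1 and alpha = 1 chosen for illustration.
j = np.arange(1, 2001)
x = j ** -1.0
y = x + sigma * rng.normal(size=x.size)

Q_true = np.sum(x ** 2)                       # target quadratic functional
Q_hat = estimate_quadratic(y, sigma, N=200)   # estimate from noisy data
```

Choosing `N` trades the truncation bias \(\sum_{j>N} x_j^2\) against the stochastic error of the corrected sum; balancing the two under the constraint set is what yields the explicit rates computed in the paper.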

We demonstrate that for estimating regular quadratic functionals [i.e., functionals which can be estimated at rate \(O(\sigma^2)\)], the difficulty of the estimation is captured by the hardest one-dimensional subproblems, while for estimating nonregular quadratic functionals [i.e., those for which no \(O(\sigma^2)\)-consistent estimator exists], the difficulty is captured by certain finite-dimensional (the dimension goes to infinity as \(\sigma \to 0\)) hypercube subproblems.

### MSC:

| MSC | Description |
|-------|-------------|
| 62M05 | Markov processes: estimation; hidden Markov models |
| 62M99 | Inference from stochastic processes |
| 62C99 | Statistical decision theory |
| 62C20 | Minimax procedures in statistical decision theory |