Summary: The minimum complexity regression estimation framework, due to A. R. Barron [Nonparametric functional estimation and related topics, NATO ASI Ser., Ser. C 335, 561-576 (1991; Zbl 0739.62001)], is a general data-driven methodology for estimating a regression function from a given list of parametric models using independent and identically distributed (i.i.d.) observations. We extend Barron's regression estimation framework to $m$-dependent observations and to strongly mixing observations.
In particular, we propose abstract minimum complexity regression estimators for dependent observations, which may be adapted to a particular list of parametric models, and we establish upper bounds on the statistical risks of the proposed estimators in terms of certain deterministic indices of resolvability. Assuming that the regression function admits a certain Fourier-transform-type representation, we examine minimum complexity regression estimators adapted to a list of parametric models based on neural networks and, using the upper bounds for the abstract estimators, we establish rates of convergence for the statistical risks of these estimators. As a key tool, we also extend the classical Bernstein inequality from i.i.d. random variables to $m$-dependent processes and to strongly mixing processes.
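For orientation, the two central quantities named above can be stated schematically (the notation here is ours, not the paper's). Barron's index of resolvability for a target regression function $f$, a list of parametric models with members $f_k$, and complexities $L_n(k)$ assigned to the models, takes the generic form
$$R_n(f) \;=\; \min_{k}\left\{\, d(f, f_k) \;+\; \frac{L_n(k)}{n} \,\right\},$$
where $d(\cdot,\cdot)$ is an appropriate approximation-error measure (e.g. squared $L_2$ distance); the risk bounds for the minimum complexity estimators are stated in terms of $R_n(f)$. The classical Bernstein inequality being extended says that, for independent random variables $X_1,\dots,X_n$ with $\mathbb{E}X_i = 0$, $|X_i| \le M$ almost surely, and $\sigma^2 = \sum_{i=1}^n \operatorname{Var}(X_i)$,
$$\mathbb{P}\!\left(\Big|\sum_{i=1}^n X_i\Big| \ge t\right) \;\le\; 2\exp\!\left(-\frac{t^2/2}{\sigma^2 + Mt/3}\right), \qquad t > 0.$$
The extensions replace the independence assumption with $m$-dependence or strong mixing, at the cost of modified constants and an extra term depending on the dependence structure.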