## Estimation up to a change-point.
*(English)*
Zbl 0779.62018

Summary: Consider the problem of estimating \(\mu\), based on the observation of \(Y_0,Y_1,\dots,Y_n\), where it is assumed only that \(Y_0,Y_1,\dots,Y_\kappa\) are iid \(N(\mu,\sigma^2)\) for some unknown \(\kappa\). Unlike the traditional change-point problem, the focus here is not on estimating \(\kappa\), which is now a nuisance parameter. When it is known that \(\kappa=k\), the sample mean \(\overline Y_k=\sum_{i=0}^k Y_i/(k+1)\) provides, in addition to wonderful efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when \(\kappa\) is unknown; indeed, if \(k>\kappa\), the risk of \(\overline Y_k\) is unbounded.
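The unbounded-risk phenomenon is easy to see in simulation. The sketch below is not from the paper: it assumes \(\mu=0\), \(\sigma=1\), and one particular post-change scenario (observations after \(\kappa\) shifted by an arbitrary amount `delta`), chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_of_mean(kappa, k, delta, mu=0.0, sigma=1.0, reps=20000):
    """Monte Carlo MSE of the sample mean over Y_0..Y_k as an estimator of mu.

    Y_0..Y_kappa are iid N(mu, sigma^2); observations after index kappa are
    shifted by delta -- one illustrative choice of the (arbitrary) post-change
    behavior.
    """
    y = rng.normal(mu, sigma, size=(reps, k + 1))
    if k > kappa:
        y[:, kappa + 1:] += delta  # contaminate the post-change observations
    est = y.mean(axis=1)           # the sample mean over Y_0..Y_k
    return np.mean((est - mu) ** 2)

# If k <= kappa, the risk is sigma^2/(k+1), regardless of delta.
# If k > kappa, the risk grows without bound as |delta| increases.
```

Running `mse_of_mean(10, 5, 100.0)` stays near \(1/6\) (only pre-change data are averaged), while `mse_of_mean(2, 10, delta)` blows up as `delta` grows, matching the claim that the risk of \(\overline Y_k\) is unbounded when \(k>\kappa\).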

To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under \(Y_0,Y_1,\dots,Y_\kappa\) iid \(N(\mu,\sigma^2)\) for each possible value of \(\kappa\). An essentially complete class under this criterion is obtained. Generalizations to other situations, such as variance estimation, are illustrated.
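As a concrete instance of the criterion, consider a fixed-window mean \(\overline Y_m\) (the index \(m\) is an illustrative choice, not notation from the paper). Combining the two facts above — the sample mean has constant risk \(\sigma^2/(m+1)\) whenever only pre-change observations are averaged, and unbounded risk once \(k>\kappa\) — its maximum-risk profile, indexed by \(\kappa\), is:

\[
\sup R_\kappa(\overline Y_m) =
\begin{cases}
\sigma^2/(m+1), & \kappa \ge m \quad (\text{only pre-change observations enter the mean}),\\
\infty, & \kappa < m \quad (\text{arbitrary post-change observations contaminate it}),
\end{cases}
\]

where the supremum is taken over \(\mu\) and over the unrestricted post-change behavior. The criterion thus records an entire risk profile per estimator, rather than a single minimax value.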

### MSC:

| MSC | Classification |
| --- | --- |
| 62F10 | Point estimation |
| 62C07 | Complete class results in statistical decision theory |
| 62C20 | Minimax procedures in statistical decision theory |
| 62A01 | Foundations and philosophical topics in statistics |
| 62L12 | Sequential estimation |