## **On second-order optimality of the observed Fisher information.**
*(English)*
Zbl 0881.62023

Summary: The realized error of an estimate is determined not only by the efficiency of the estimator, but also by chance. For example, suppose that we have observed a bivariate normal vector whose expectation is known to lie on a circle. Then, intuitively, the longer that vector happens to be, the more accurately its angle is likely to be estimated. Yet this chance element, though the information about it is contained in the data, cannot be accounted for by the variance of the estimator. One way to capture it is by direct estimation of the realized error.
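The circle example can be made concrete with a small simulation. In the sketch below (a numerical illustration, not taken from the paper; the radius `r`, the true angle `theta`, the sample size, and the seed are arbitrary choices), the observation is $X \sim N(\mu, I_2)$ with $\mu = r(\cos\theta, \sin\theta)$ and known $r$; the maximum likelihood estimate of the angle is $\hat\theta = \operatorname{atan2}(X_2, X_1)$, and a short calculation gives the observed Fisher information for the angle as $-\ell''(\hat\theta) = r\lVert X\rVert$, so longer observed vectors report a smaller estimated error:

```python
import numpy as np

rng = np.random.default_rng(0)
r, theta = 3.0, 0.5                        # known radius, true angle (arbitrary)
mu = r * np.array([np.cos(theta), np.sin(theta)])

# Bivariate normal observations with identity covariance, mean on the circle
x = mu + rng.standard_normal((200_000, 2))
norm = np.hypot(x[:, 0], x[:, 1])

# MLE of the angle, and its error wrapped into (-pi, pi]
theta_hat = np.arctan2(x[:, 1], x[:, 0])
err = np.angle(np.exp(1j * (theta_hat - theta)))

# Observed Fisher information for the angle: -l''(theta_hat) = r * ||x||,
# versus the constant expected Fisher information r**2
obs_info = r * norm

# Split the sample at the median length: longer vectors should have
# a smaller realized mean squared angular error
long_half = norm > np.median(norm)
mse_long = np.mean(err[long_half] ** 2)
mse_short = np.mean(err[~long_half] ** 2)
print(mse_long, mse_short)
```

The split at the median length is only for display; the point is that the realized accuracy of $\hat\theta$ varies with $\lVert X\rVert$, which the constant expected information $r^2$ cannot reflect.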

We will demonstrate that the squared error of the maximum likelihood estimate, to the extent to which it can be estimated, is most accurately estimated by the inverse of the observed Fisher information. In relation to this optimality, we will also study the properties of several other estimators, including the inverse of the expected Fisher information and the sandwich, jackknife, and bootstrap estimators. Unlike the observed Fisher information, these estimators are not optimal.
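The claimed advantage of the observed over the expected Fisher information can be checked numerically in the same toy circle model. The sketch below (a Monte Carlo illustration under arbitrary choices of radius, angle, sample size, and seed; it is not the paper's argument) compares how closely the two inverse-information quantities track the realized squared error of the angle estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
r, theta = 3.0, 0.5                        # known radius, true angle (arbitrary)
mu = r * np.array([np.cos(theta), np.sin(theta)])

x = mu + rng.standard_normal((200_000, 2))
norm = np.hypot(x[:, 0], x[:, 1])

# Realized squared error of the MLE of the angle, wrapped into (-pi, pi]
err2 = np.angle(np.exp(1j * (np.arctan2(x[:, 1], x[:, 0]) - theta))) ** 2

inv_obs = 1.0 / (r * norm)   # inverse observed Fisher information (varies with ||x||)
inv_exp = 1.0 / r**2         # inverse expected Fisher information (constant)

# Mean squared deviation of each estimator from the realized squared error
mse_obs = np.mean((err2 - inv_obs) ** 2)
mse_exp = np.mean((err2 - inv_exp) ** 2)
print(mse_obs, mse_exp)
```

Because $1/(r\lVert X\rVert)$ moves with the realized accuracy of $\hat\theta$ while $1/r^2$ does not, the observed-information estimate deviates less, on average, from the realized squared error.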