
A location invariant moment-type estimator. I. (English) Zbl 1193.62052

Teor. Jmovirn. Mat. Stat. 76, 22-30 (2007) and Theory Probab. Math. Stat. 76, 23-31 (2008).
Let \(X_1,X_2,\dots,X_n\) be i.i.d. random variables with common distribution function (d.f.) \(F(x)\) and let \(X_{1,n}\leq X_{2,n}\leq\dots\leq X_{n,n}\) be the associated order statistics. If there exist numbers \(a_n>0\), \(b_n\in\mathbb R\) and a non-degenerate d.f. \(G(x)\) such that \[ P\{X_{n,n}\leq a_n x+b_n\}=F^n(a_n x+b_n)\to G(x),\quad n\to\infty, \] then \(G(x)\) is of the same type as \[ G_{\gamma}(x)=\exp\{-(1+\gamma x)^{-1/\gamma}\},\quad 1+\gamma x>0,\;\gamma\neq0,\quad\text{or}\quad G_{\gamma}(x)=\exp\{-\exp(-x)\},\quad x\in\mathbb R,\;\gamma=0. \] In this case \(F(x)\) belongs to the domain of attraction of the extreme value d.f. \(G_{\gamma}\), and \(\gamma\) is referred to as the extreme value index.
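For instance (a minimal numerical sketch of the reviewer's own, not part of the paper; the sample sizes and the evaluation point \(x\) are arbitrary), maxima of standard Pareto samples with tail index \(\alpha\) lie in the domain of attraction of the Fréchet law \(\exp\{-x^{-\alpha}\}\), which is of the type \(G_{\gamma}\) with \(\gamma=1/\alpha>0\):

```python
# Illustration only: maxima of standard Pareto samples, F(x) = 1 - x**(-alpha) for x >= 1,
# normalized by a_n = n**(1/alpha), b_n = 0, approach the Frechet limit exp(-x**(-alpha)),
# i.e. an extreme value law with gamma = 1/alpha > 0.
import numpy as np

rng = np.random.default_rng(0)
alpha, n, reps = 2.0, 2_000, 2_000
samples = rng.pareto(alpha, size=(reps, n)) + 1.0   # standard Pareto on [1, infinity)
maxima = samples.max(axis=1) / n ** (1.0 / alpha)   # (X_{n,n} - b_n) / a_n

x = 2.0
print(np.mean(maxima <= x))       # empirical P{X_{n,n} <= a_n x}
print(np.exp(-x ** (-alpha)))     # Frechet limit exp(-x**(-alpha)), approximately 0.7788
```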
Over the last two decades, many estimators of the extreme value index \(\gamma \in\mathbb R\) based on upper order statistics have been proposed. A.L.M. Dekkers, J.H.J. Einmahl and L. de Haan [Ann. Stat. 17, No. 4, 1833–1855 (1989; Zbl 0701.62029)] proposed the moment-type estimator \[ \hat{\gamma}_{n}^{M}(k)= M_{n}^{(1)} +1-2^{-1}\left\{ 1-\frac{(M_{n}^{(1)})^2}{M_{n}^{(2)}} \right\}^{-1},\quad M_{n}^{(j)}=k^{-1}\sum_{i=0}^{k-1}\left(\log\frac{X_{n-i,n}}{X_{n-k,n}}\right)^j,\;j=1,2. \] This estimator is scale invariant but not location invariant. M.I. Fraga Alves [Extremes 4, No. 3, 199–217 (2001; Zbl 1053.62063)] proposed a location invariant Hill-type estimator given by \[ \hat{\gamma}_{n}^{H}(k_0,k)=k_0^{-1}\sum_{i=0}^{k_0-1}\log \left( \frac{X_{n-i,n}-X_{n-k,n}}{ X_{n-k_0,n}-X_{n-k,n}}\right), \] where \(k_0\to\infty\), \(k\to\infty\), \(k/n\to0\) and \(k_0/k\to 0\), and discussed its weak consistency, asymptotic expansion and the optimal choice of the sample fraction \(k_0\).
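For concreteness, a minimal Python sketch (the reviewer's own coding of the two displayed formulas, not taken from the cited papers; the indexing conventions are an assumption) of the two estimators just recalled:

```python
# Illustration only: the moment estimator of Dekkers, Einmahl and de Haan, and the
# location invariant Hill-type estimator of Fraga Alves, coded from the formulas above.
import numpy as np

def moment_estimator(x, k):
    """gamma^M_n(k) from the k upper order statistics; requires X_{n-k,n} > 0."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    logs = np.log(xs[n - k:] / xs[n - k - 1])       # log(X_{n-i,n}/X_{n-k,n}), i = 0,...,k-1
    m1, m2 = np.mean(logs), np.mean(logs ** 2)
    return m1 + 1.0 - 0.5 / (1.0 - m1 ** 2 / m2)

def hill_location_invariant(x, k0, k):
    """gamma^H_n(k0, k) with k0 < k < n."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    num = xs[n - k0:] - xs[n - k - 1]               # X_{n-i,n} - X_{n-k,n}, i = 0,...,k0-1
    den = xs[n - k0 - 1] - xs[n - k - 1]            # X_{n-k0,n} - X_{n-k,n}
    return np.mean(np.log(num / den))
```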
In this paper, a general estimator of \(\gamma \in\mathbb R\), based on the location invariant Hill-type estimator and the moment-type estimator, is proposed. It is given by \[ \hat{\gamma}_{n}^{M}(k_0,k)= M_{n}^{(1)}(k_0,k) +1-2^{-1}\left\{ 1-\frac{(M_{n}^{(1)}(k_0,k))^2}{M_{n}^{(2)}(k_0,k)} \right\}^{-1}, \]
\[ M_{n}^{(j)}(k_0,k)=k_0^{-1}\sum_{i=0}^{k_0-1}\left(\log \frac{X_{n-i,n}-X_{n-k,n}}{ X_{n-k_0,n}-X_{n-k,n}} \right)^j,\;j=1,2. \] The weak and strong consistency of this new estimator are derived.
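A corresponding sketch of the proposed location invariant moment-type estimator (again an assumed implementation written from the displayed formulas, not the authors' code):

```python
# Illustration only: the location invariant moment-type estimator gamma^M_n(k0, k),
# obtained by plugging the shifted log-ratios M^(1)_n(k0,k), M^(2)_n(k0,k)
# into the moment-estimator formula.
import numpy as np

def moment_location_invariant(x, k0, k):
    """gamma^M_n(k0, k) with k0 < k < n."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    logs = np.log((xs[n - k0:] - xs[n - k - 1]) /
                  (xs[n - k0 - 1] - xs[n - k - 1]))  # log((X_{n-i,n}-X_{n-k,n})/(X_{n-k0,n}-X_{n-k,n}))
    m1, m2 = np.mean(logs), np.mean(logs ** 2)
    return m1 + 1.0 - 0.5 / (1.0 - m1 ** 2 / m2)
```

In practice one would take, for example, \(k=\lfloor n^{0.8}\rfloor\) and \(k_0=\lfloor k^{0.9}\rfloor\), so that \(k/n\to0\) and \(k_0/k\to0\); these particular choices are illustrative only.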

MSC:

62G05 Nonparametric estimation
62G32 Statistics of extreme values; tail inference
62G30 Order statistics; empirical distribution functions
62G20 Asymptotic properties of nonparametric inference