# zbMATH — the first resource for mathematics

Statistical analysis of extreme values. From insurance, finance, hydrology and other fields. With 1 CD-ROM (Windows). 2nd ed. (English) Zbl 1002.62002
Basel: Birkhäuser. xviii, 443 p. (2001).
The asymptotic theory of extremes is a well developed mathematical discipline. It tells us what to expect in extremal behavior, and it provides possible models for data evaluation when extremes govern the underlying stochastic laws (examples are high winds, floods, human life spans, and many others). The models look simple because the only possible limit distribution for normalized maxima of independent and identically distributed random variables is a three-parameter family of distributions $$H_c(A,B,x)$$, where $$A$$ and $$B$$ are related to location and scale, and $$c$$ is a shape parameter. The estimation of $$c$$, however, is very sensitive, and no generally acceptable estimator for it exists at present, even under ideal circumstances. The most widely used method for estimating $$c$$, known as the threshold estimator, is based on the top $$k$$ order statistics in a set of $$n$$ observations, where $$n$$ and $$k$$ must be large and $$k/n$$ must be small.
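For orientation, in the von Mises parametrization commonly used in the literature (the book's own notation may differ), the family reads

$$H_c(A,B,x)=\exp\left\{-\left(1+c\,\frac{x-A}{B}\right)^{-1/c}\right\},\qquad 1+c\,\frac{x-A}{B}>0,$$

where the case $$c=0$$ is understood as the Gumbel limit $$\exp\{-e^{-(x-A)/B}\}$$; for $$c>0$$ one obtains Fréchet-type and for $$c<0$$ Weibull-type distributions, the latter with finite upper endpoint $$A-B/c$$.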
In fact, each of $$1/k$$, $$1/n$$ and $$k/n$$ must be negligible compared with the estimated value of $$c$$.
Since in many real life situations $$c$$ is in the range of $$-0.05$$ to $$0.05$$, the requirements on $$n$$ and $$k$$ in the preceding sentence can hardly be met with available data sets. Add to this that $$A$$ and $$B$$ are estimated from the data, after which the initial observations for further analysis become exchangeable rather than independent. The limiting distribution is then no longer given by the classical limit theorem; a much better approximation is provided by the penultimate limits, first mentioned in the literature by R.A. Fisher and L. Tippett [Proc. Camb. Philos. Soc. 24, 180-190 (1928)] and fully developed into general results by M.I. Gomes in her Ph.D. thesis (1978), a large part of which appeared in [Ann. Inst. Stat. Math. 36, 71-85 (1984; Zbl 0561.62015)].
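To make the role of $$k$$ and $$n$$ concrete, here is a minimal sketch of a threshold-type estimate in its simplest instance, the Hill estimator, which is valid only for $$c>0$$ (the general threshold estimators the review refers to are more involved). The simulated Pareto sample is hypothetical illustration, not data from the book:

```python
import numpy as np

def hill_estimator(data, k):
    """Hill-type estimate of the shape parameter c (valid for c > 0),
    based on the top k order statistics of the n observations."""
    x = np.sort(np.asarray(data, dtype=float))
    n = x.size
    if not 0 < k < n:
        raise ValueError("need 0 < k < n")
    # Mean of log-excesses of the k largest values over the (k+1)-th largest.
    return np.mean(np.log(x[n - k:])) - np.log(x[n - k - 1])

# Hypothetical simulated Pareto sample with true shape c = 0.5 (tail index 2):
rng = np.random.default_rng(0)
sample = rng.uniform(size=100_000) ** -0.5
c_hat = hill_estimator(sample, k=1_000)
print(f"estimated c = {c_hat:.3f}")
```

Even in this idealized setting the estimate varies with the choice of $$k$$; for $$c$$ near zero, as in the situations the reviewer describes, the requirements on $$n$$ and $$k$$ become prohibitive.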
Yet, in the past 10 to 12 years, a number of books have appeared on the statistical analysis of extremes which ignore all these difficulties and spoil the theory by giving guidance to applied scientists who, in turn, when using this guidance, obtain contradictory results. The most serious problem is that if a value $$c<0$$ is accepted as the true one, then $$-1/c$$ provides an upper bound on the normalized random quantity under investigation, which may lead to huge losses to society. For more details, see J. Galambos [Statistics for the 21st Century, 173-187 (2000)], and J. Galambos and N. Macri [J. Appl. Stat. Sci. 9, No. 4, 253-263 (2000; Zbl 0962.62108); J. Struct. Eng. 15, No. 7, 792-794 (1999); ibid. 128, 273-274 (2002)].
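The sensitivity involved can be seen directly: the sign of the estimated $$c$$ decides whether the fitted model has a finite upper endpoint at all. A minimal sketch, using the endpoint $$A-B/c$$ of the family $$H_c(A,B,x)$$ with the hypothetical standardized choice $$A=0$$, $$B=1$$ (so that the bound is $$-1/c$$, as in the review):

```python
def implied_upper_endpoint(c, A=0.0, B=1.0):
    """Upper endpoint of the fitted H_c(A, B, .): A - B/c when c < 0,
    infinite otherwise."""
    return A - B / c if c < 0 else float("inf")

# A tiny shift in the estimate of c near zero changes the implied
# bound dramatically, or removes it altogether:
for c_hat in (-0.05, -0.01, 0.01):
    print(c_hat, implied_upper_endpoint(c_hat))  # 20.0, 100.0, inf
```

An estimate of $$c=-0.05$$ thus asserts that the standardized quantity can never exceed 20, while an estimate of $$+0.01$$ from the same data asserts no bound whatsoever.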
The book under review is careful in regard to the problems raised above. This is the only book known to the reviewer that emphasizes the importance of penultimate behavior. Furthermore, while not criticizing the threshold method, the authors point out that the same set of data may lead to conclusions different from those given in the book, and actual citations are given.
The structure of the book remains the same as in the first edition from 1997 [see the review in Zbl 0880.62002]. It is therefore more than an evaluation of extremal data: the authors deal with general problems of parametric models, including processes and time series. The book can thus be of interest to statisticians who are not necessarily dealing with extremal problems.

##### MSC:
62-02 Research exposition (monographs, survey articles) pertaining to statistics
62G32 Statistics of extreme values; tail inference
60G70 Extreme value theory; extremal stochastic processes
62G30 Order statistics; empirical distribution functions
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62P05 Applications of statistics to actuarial sciences and financial mathematics
86A32 Geostatistics
##### Keywords:
time series

##### Software:
XTREMES