

Une mesure d’information caractérisant la loi de Poisson. (An information measure characterising the Poisson distribution). (French) Zbl 0621.60028
Sémin. probabilités XXI, Lect. Notes Math. 1247, 563-573 (1987).
[For the entire collection see Zbl 0606.00022.]
After a brief introduction reviewing "information-theoretic" arguments in probability, a discrete analogue of the Fisher information measure is defined by \[ I(P)=\sum^{\infty}_{k=0}\frac{[P(k)-P(k-1)]^2}{P(k)}, \] where \(\operatorname{supp} P=\{0,1,\dots\}\) and \(P(-1):=0\). It is shown that \(I(P)\) satisfies \[ I(P)\,\mathrm{Var}(P)\geq 1;\quad I(P*Q)\leq \min\{I(P),I(Q)\};\quad 4I(P*Q)\leq I(P)+I(Q). \] In Section 4, the quantity \(I(P)\) is used to establish a criterion for the convergence of certain discrete measures to the Poisson distribution. This is applied, e.g., to show that if \((X_{n,k})\) is a triangular array of row-wise independent \(\{0,1\}\)-valued random variables with \(P(X_{n,k}=1)=p_{n,k}\), and if \(\max_{k}p_{n,k}\to 0\) and \(\sum_{k}p_{n,k}\to \lambda\) as \(n\to\infty\), then \(\sum_{k}X_{n,k}\) converges in distribution to the Poisson distribution with parameter \(\lambda\) as \(n\to\infty\).
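A minimal numerical sketch of the definition above (not from the paper; the function names and the truncation level are illustrative choices): for the Poisson distribution with parameter \(\lambda\) a direct computation gives \(I(P)=1/\lambda\), so the bound \(I(P)\,\mathrm{Var}(P)\geq 1\) should be attained with equality, consistent with the characterization announced in the title.

```python
import math

def discrete_fisher_information(p):
    """I(P) = sum_k [P(k) - P(k-1)]^2 / P(k) with P(-1) := 0,
    for p given as a list of probabilities on {0, 1, ...}."""
    total = 0.0
    prev = 0.0  # P(-1) = 0
    for pk in p:
        if pk > 0:  # the sum runs over the support of P
            total += (pk - prev) ** 2 / pk
        prev = pk
    return total

def poisson_pmf(lam, n_terms):
    # Truncated Poisson pmf; the neglected tail mass is negligible
    # for the truncation level used below.
    return [math.exp(-lam) * lam ** k / math.factorial(k)
            for k in range(n_terms)]

def variance(p):
    mean = sum(k * pk for k, pk in enumerate(p))
    return sum((k - mean) ** 2 * pk for k, pk in enumerate(p))

lam = 3.0
p = poisson_pmf(lam, 60)
print(discrete_fisher_information(p))                # ~ 1/lam = 0.333...
print(discrete_fisher_information(p) * variance(p))  # ~ 1: equality case
```

Replacing `poisson_pmf` with any other pmf on \(\{0,1,\dots\}\) (e.g., a binomial) should yield a product strictly greater than 1, illustrating the first inequality.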
Additional references on information-theoretic methods, beyond the very complete bibliography of the paper, can be found in I. J. Good's review article, J. Am. Stat. Assoc. 78, 987-989 (1983), which gives 36 references related to the "maximum entropy formalism"; S. Watanabe, Knowing and guessing. A quantitative study of inference and information (1969; Zbl 0206.209), provides additional insight into the meaning of "measures of information"; and H. J. Landau, Maximum entropy and the moment problem, Bull. Am. Math. Soc., New Ser. 16, 47-77 (1987), is a recent expository paper.
Reviewer: W. Bryc

MSC:
60F05 Central limit and other weak theorems
60E05 Probability distributions: general theory
60E99 Distribution theory
94A15 Information theory (general)
Full Text: Numdam EuDML