zbMATH — the first resource for mathematics

Capacity and error exponents of stationary point processes under random additive displacements. (English) Zbl 1318.60055
Summary: Consider a real-valued discrete-time stationary and ergodic stochastic process, called the noise process. For each dimension \(n\), we can choose a stationary point process in \(\mathbb R^{n}\) and a translation invariant tessellation of \(\mathbb R^{n}\). Each point is randomly displaced, with a displacement vector being a section of length \(n\) of the noise process, independent from point to point. The aim is to find a point process and a tessellation that minimize the probability of decoding error, defined as the probability that the displaced version of the typical point does not belong to the cell of this point. We consider the Shannon regime, in which the dimension \(n\) tends to \(\infty\), while the logarithm of the intensity of the point processes, normalized by dimension, tends to a constant. We first show that this problem exhibits a sharp threshold: if the sum of the asymptotic normalized logarithmic intensity and of the differential entropy rate of the noise process is positive, then the probability of error tends to 1 with \(n\) for all point processes and all tessellations. If it is negative, then there exist point processes and tessellations for which this probability tends to 0. The error exponent function, which quantifies how quickly the probability of error goes to 0 in \(n\), is then derived using large deviations theory. If the entropy spectrum of the noise satisfies a large deviations principle, then below the threshold the error probability goes exponentially fast to 0, with an exponent given in closed form in terms of the rate function of the noise entropy spectrum. This is obtained for two classes of point processes: the Poisson process and a Matérn hard-core point process.
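The threshold stated above can be written compactly as follows (the notation \(\delta\), \(\bar h\), \(p_e(n)\) is introduced here for illustration and is not taken from the entry itself):

```latex
% \lambda_n = intensity of the point process in dimension n,
% \bar h    = differential entropy rate of the noise process,
% p_e(n)   = probability of decoding error.
\[
  \delta \;=\; \lim_{n\to\infty} \frac{1}{n}\log \lambda_n
\]
\[
  \delta + \bar h > 0 \;\Longrightarrow\; p_e(n) \to 1
  \quad\text{(for all point processes and tessellations),}
\]
\[
  \delta + \bar h < 0 \;\Longrightarrow\; p_e(n) \to 0
  \quad\text{(for suitable point processes and tessellations).}
\]
```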
From this, new lower bounds are derived on the error exponents of Shannon’s additive noise channel in the high signal-to-noise-ratio limit; these bounds hold for all stationary and ergodic noises with the above properties and match the best known bounds in the white Gaussian noise case.
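The threshold can be illustrated numerically in the Gaussian special case. The sketch below (not from the paper; the function name and parameters are my own) estimates the error probability of a homogeneous Poisson process of intensity \(e^{n\delta}\) under i.i.d. \(N(0,\sigma^2)\) displacements with nearest-point (Voronoi) decoding: conditional on the displacement \(D\), the error probability equals \(1-\exp(-\lambda v_n \|D\|^n)\) by the Poisson void probability, where \(v_n\) is the volume of the unit ball.

```python
import math
import random

def error_prob(n, delta, sigma=1.0, trials=2000, seed=0):
    """Monte Carlo estimate of the decoding-error probability for a
    homogeneous Poisson process of intensity exp(n*delta) in R^n, with
    i.i.d. N(0, sigma^2) displacements and nearest-point decoding.
    Given the displacement D, an error occurs iff some Poisson point
    falls in the ball of radius ||D|| around the displaced point, an
    event of probability 1 - exp(-lam * v_n * ||D||^n)."""
    rng = random.Random(seed)
    log_lam = n * delta
    # log of the volume of the unit ball in R^n
    log_vn = (n / 2) * math.log(math.pi) - math.lgamma(n / 2 + 1)
    total = 0.0
    for _ in range(trials):
        r2 = sum(rng.gauss(0.0, sigma) ** 2 for _ in range(n))  # ||D||^2
        log_mean = log_lam + log_vn + (n / 2) * math.log(r2)    # log(lam * v_n * ||D||^n)
        total += -math.expm1(-math.exp(min(log_mean, 700.0)))   # 1 - exp(-mean)
    return total / trials

# Differential entropy rate of N(0,1) noise, in nats.
h = 0.5 * math.log(2 * math.pi * math.e)
print(error_prob(40, -h - 0.5))  # delta + h < 0: error probability near 0
print(error_prob(40, -h + 0.5))  # delta + h > 0: error probability near 1
```

Moving \(\delta\) across the threshold \(-\bar h\) flips the estimate from essentially 0 to essentially 1 already at moderate \(n\), consistent with the sharp-threshold statement in the summary.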

60G55 Point processes (e.g., Poisson, Cox, Hawkes processes)
60G10 Stationary stochastic processes
60D05 Geometric probability and stochastic geometry
60F10 Large deviations
94A15 Information theory (general)
Full Text: DOI Euclid
[1] Anantharam, V. and Baccelli, F. (2008). A Palm theory approach to error exponents. In Proc. IEEE Internat. Symp. Inf. Theory , IEEE, pp. 1768-1772.
[2] Anantharam, V. and Baccelli, F. (2010). Information-theoretic capacity and error exponents of stationary point processes under random additive displacements. Preprint. Available at http://uk.arxiv.org/abs/1012.4924v1. · Zbl 1318.60055 · doi:10.1239/aap/1427814578
[3] Ashikhmin, A. E., Barg, A. and Litsyn, S. N. (2000). A new upper bound on the reliability function of the Gaussian channel. IEEE Trans. Inf. Theory 46 , 1945-1961. · Zbl 1003.94047 · doi:10.1109/18.868471
[4] Barron, A. (1985). The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem. Ann. Prob. 13 , 1292-1303. · Zbl 0608.94001 · doi:10.1214/aop/1176992813
[5] Cover, T. M. and Thomas, J. A. (1991). Elements of Information Theory . John Wiley, New York. · Zbl 0762.94001
[6] Daley, D. J. and Vere-Jones, D. (1988). An Introduction to the Theory of Point Processes . Springer, New York. · Zbl 0657.60069
[7] Dembo, A. and Zeitouni, O. (1993). Large Deviations Techniques and Applications . Jones and Bartlett, Boston, MA. · Zbl 0793.60030
[8] El Gamal, A. and Kim, Y.-H. (2011). Network Information Theory . Cambridge University Press. · Zbl 1238.94001 · doi:10.1017/CBO9781139030687
[9] Gallager, R. G. (1968). Information Theory and Reliable Communication . John Wiley, New York. · Zbl 0198.52201
[10] Gray, R. M. (2006). Toeplitz and Circulant Matrices: A Review . NOW Publishers, Delft. · Zbl 1115.15021
[11] Han, T. S. (2003). Information-Spectrum Methods in Information Theory . Springer, Berlin. · Zbl 1010.94001
[12] Kallenberg, O. (1983). Random Measures , 3rd edn. Akademie-Verlag, Berlin. · Zbl 0345.60032
[13] Kieffer, J. C. (1974). A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem. Pacific J. Math. 51 , 203-206.
[14] Matérn, B. (1986). Spatial Variation , 2nd edn. Springer, Berlin. · Zbl 0608.62122
[15] Møller, J. (1994). Lectures on Random Voronoĭ Tessellations (Lecture Notes Statist. 87 ). Springer, New York.
[16] Poltyrev, G. (1994). On coding without restrictions for the AWGN channel. IEEE Trans. Inf. Theory 40 , 409-417. · Zbl 0821.94035 · doi:10.1109/18.335935
[17] Shannon, C. E. (1948). A mathematical theory of communication. Bell Sys. Tech. J. 27 , 379-423, 623-656. · Zbl 1154.94303 · doi:10.1002/j.1538-7305.1948.tb01338.x
[18] Shannon, C. E. (1959). Probability of error for optimal codes in a Gaussian channel. Bell Sys. Tech. J. 38 , 611-656. · doi:10.1002/j.1538-7305.1959.tb03905.x
[19] Varadhan, S. R. S. (1984). Large Deviations and Applications . SIAM, Philadelphia, PA. · Zbl 0549.60023
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.