A computational theory of surprise. (English) Zbl 1071.94003
Blaum, Mario (ed.) et al., Information, coding and mathematics. Proceedings of a workshop honoring Professor Bob McEliece on his 60th birthday, Pasadena, CA, USA, May 24–25, 2002. Boston, MA: Kluwer Academic Publishers (ISBN 1-4020-7079-9/hbk). The Kluwer International Series in Engineering and Computer Science 687, 1-25 (2002).
Summary: While eminently successful for the transmission of data, Shannon’s theory of information does not address semantic and subjective dimensions of data, such as relevance and surprise. We propose an observer-dependent computational theory of surprise, where surprise is defined as the relative entropy between the prior and the posterior distributions of an observer. Surprise requires integration over the space of models, in contrast with Shannon’s entropy, which requires integration over the space of data. We show how surprise can be computed exactly in a number of discrete and continuous cases using distributions from the exponential family with conjugate priors. We show that during sequential Bayesian learning, surprise decreases like \(1/N\), and we study how surprise differs from and complements Shannon’s definition of information.
For the entire collection see [Zbl 1054.94001].
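The definition in the summary can be illustrated with a minimal sketch (not taken from the paper itself): for a Beta-Bernoulli conjugate pair, the surprise \(\mathrm{KL}(\text{prior}\,\|\,\text{posterior})\) has a closed form, and simulating sequential learning on a fair coin shows the roughly \(1/N\) decay of the surprise carried by each new observation. The helper names (`digamma`, `betaln`, `kl_beta`) are this sketch's own choices, and the digamma approximation is a standard recurrence-plus-asymptotic-series device, not anything specified in the paper.

```python
import math

def digamma(x: float) -> float:
    """Digamma psi(x) via recurrence plus asymptotic series (stdlib only)."""
    r = 0.0
    while x < 6.0:              # push the argument into the asymptotic regime
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

def betaln(a: float, b: float) -> float:
    """Log of the Beta function B(a, b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def kl_beta(a1: float, b1: float, a2: float, b2: float) -> float:
    """Closed-form KL(Beta(a1,b1) || Beta(a2,b2)) in nats."""
    return (betaln(a2, b2) - betaln(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

# Sequential Bayesian learning on a fair coin: starting from a uniform
# Beta(1,1) prior, the surprise of each new observation (the KL divergence
# from the current posterior to the updated one) decays roughly like 1/(2N).
a, b = 1.0, 1.0
surprises = []
for n in range(400):
    if n % 2 == 0:              # observe a head
        surprises.append(kl_beta(a, b, a + 1, b))
        a += 1.0
    else:                       # observe a tail
        surprises.append(kl_beta(a, b, a, b + 1))
        b += 1.0
```

After `N` observations the ratio `surprises[N] / surprises[2N]` tends to 2, consistent with the \(1/N\) decay stated in the summary; a maximally informative observation relative to a concentrated prior would instead produce a large, slowly shrinking surprise, which is the observer-dependence the theory emphasizes.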

94A17 Measures of information, entropy
62B10 Statistical aspects of information-theoretic topics
94A15 Information theory (general)