
Mutual dependence of random variables and maximum discretized entropy. (English) Zbl 0563.60023

For a random vector \((X,Y)\) in the unit square Q and a pair \((m,n)\) of positive integers, we consider all discretizations of the continuous probability distribution of \((X,Y)\) obtained by an \(m\times n\) Cartesian decomposition of Q. We prove that Y is a (continuous and invertible) function of X if and only if, for every \((m,n)\), the maximum entropy of the resulting finite distributions equals \(\log (m+n-1)\).
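As a hedged illustration of the result (a numerical sketch not taken from the paper), consider the invertible case Y = X. The graph of an increasing function meets at most \(m+n-1\) cells of an \(m\times n\) grid, and if the cut points of the decomposition are placed at multiples of \(1/(m+n-1)\), each of those \(m+n-1\) cells receives equal mass, so the discretized entropy attains \(\log (m+n-1)\). The function names below are illustrative, not from the source:

```python
import math

def cell_masses(x_cuts, y_cuts):
    """Mass of each grid cell under (X, Y) = (U, U), U uniform on [0, 1].

    For Y = X, the mass of cell (i, j) is the length of the overlap of the
    i-th X-interval and the j-th Y-interval along the diagonal of the square.
    """
    xs = [0.0] + list(x_cuts) + [1.0]
    ys = [0.0] + list(y_cuts) + [1.0]
    masses = []
    for i in range(len(xs) - 1):
        for j in range(len(ys) - 1):
            lo = max(xs[i], ys[j])
            hi = min(xs[i + 1], ys[j + 1])
            if hi > lo:
                masses.append(hi - lo)  # overlap length = cell mass
    return masses

def entropy(p):
    """Shannon entropy (natural log) of a finite distribution."""
    return -sum(q * math.log(q) for q in p if q > 0)

m, n = 3, 4
# m + n - 2 equally spaced cut points; split them (in order) into
# m - 1 vertical and n - 1 horizontal grid lines.
cuts = [k / (m + n - 1) for k in range(1, m + n - 1)]
x_cuts, y_cuts = cuts[:m - 1], cuts[m - 1:]

p = cell_masses(x_cuts, y_cuts)
# Exactly m + n - 1 cells carry mass, each with mass 1/(m+n-1),
# so the entropy equals log(m + n - 1).
print(len(p), entropy(p), math.log(m + n - 1))
```

Any other split of the sorted cut points into increasing x- and y-cuts gives the same count of occupied cells, since between consecutive cuts the diagonal stays inside a single cell.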

MSC:

60E99 Distribution theory
62-07 Data analysis (statistics) (MSC2010)
62B10 Statistical aspects of information-theoretic topics