
Blind source separation using Rényi’s \(\alpha\)-marginal entropies. (English) Zbl 1047.68097

Summary: We have recently proposed minimizing a nonparametric estimator of Rényi’s mutual information as a criterion for blind source separation. With a two-stage topology, consisting of spatial whitening followed by a series of Givens rotations, the cost function reduces to a sum of marginal entropies, just as in the Shannon entropy case. Because we use a Parzen window density estimator and eliminate the joint entropy by constraining the demixing matrix to be orthonormal, we avoid, respectively, the density inaccuracy caused by truncating a series expansion and the estimation of joint densities in high-dimensional spaces (given the typical paucity of data). Our previous formulation was restricted to Rényi’s second-order entropy with Gaussian kernels in the Parzen window estimator. The present work extends those results with a new estimation methodology for Rényi’s entropy that allows the designer to choose any order of entropy and any suitable kernel function. Simulations illustrate that the proposed method compares favorably with Hyvärinen’s FastICA, Bell and Sejnowski’s Infomax, and Comon’s minimum of mutual information.
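The two-stage idea described above can be sketched in a few lines: whiten the mixtures, then search over a Givens rotation for the angle that minimizes the sum of marginal Rényi entropies, each entropy estimated by plugging a Gaussian Parzen window density into Rényi’s formula. This is only an illustrative sketch, not the paper’s exact estimator or optimization procedure: the kernel width `sigma`, the grid search over angles, and the restriction to two sources are all simplifying assumptions made here.

```python
import numpy as np

def parzen_renyi_entropy(x, alpha=2.0, sigma=0.25):
    """Plug-in estimate of Renyi's alpha-entropy of 1-D samples x using a
    Gaussian Parzen window. alpha and sigma are free choices here, not values
    prescribed by the paper."""
    x = np.asarray(x, dtype=float)
    diffs = x[:, None] - x[None, :]
    k = np.exp(-diffs**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    f_hat = k.mean(axis=1)                     # Parzen density at each sample
    if np.isclose(alpha, 1.0):                 # Shannon limit as alpha -> 1
        return float(-np.log(f_hat).mean())
    return float(np.log((f_hat ** (alpha - 1.0)).mean()) / (1.0 - alpha))

def separate_2d(x, alpha=2.0, n_angles=90, sigma=0.25):
    """Two-stage demixing sketch for 2 sources: spatial whitening followed by
    a grid search over one Givens rotation minimizing the sum of marginal
    Renyi entropies (the paper treats the general multichannel case)."""
    x = x - x.mean(axis=1, keepdims=True)
    d, e = np.linalg.eigh(np.cov(x))           # whitening via eigendecomposition
    xw = (e / np.sqrt(d)) @ e.T @ x            # cov(xw) is now the identity
    best_theta, best_cost = 0.0, np.inf
    for theta in np.linspace(0.0, np.pi / 2, n_angles, endpoint=False):
        c, s = np.cos(theta), np.sin(theta)
        y = np.array([[c, -s], [s, c]]) @ xw
        cost = sum(parzen_renyi_entropy(yi, alpha, sigma) for yi in y)
        if cost < best_cost:
            best_cost, best_theta = cost, theta
    c, s = np.cos(best_theta), np.sin(best_theta)
    return np.array([[c, -s], [s, c]]) @ xw
```

Because the demixing matrix after whitening is constrained to be a rotation (orthonormal), the joint entropy term is invariant and drops out of the cost, which is why only the marginal entropies need to be estimated.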

MSC:

68T05 Learning and adaptive systems in artificial intelligence
68T10 Pattern recognition, speech recognition
94A12 Signal theory (characterization, reconstruction, filtering, etc.)