zbMATH — the first resource for mathematics

Spectral learning on matrices and tensors. (English) Zbl 1436.68004
The book is primarily a survey of tensor decomposition methods. This topic is dealt with in five of its seven chapters, namely Chapters 3–7, after a short introductory chapter and another short chapter devoted to matrix decomposition. A core part, forming approximately a third of the book, is the comprehensive Chapter 3, devoted to important tensor decomposition algorithms. After introducing several basic concepts, such as tensor matricization and two kinds of tensor rank, it describes a plethora of algorithms, ranging from the rather specific orthogonal decomposition to the rather general alternating least squares, and recalls theoretical results concerning them. Chapter 4 presents several important applications of tensor decomposition to data analysis and modelling, such as independent component analysis, mixtures of axis-aligned Gaussians, latent Dirichlet allocation, and hidden Markov models. In the context of the current popularity of deep learning, it is worth mentioning that the chapter also covers the moment tensor structure of neural networks. Chapters 5 and 6 address, respectively, the availability of implementations of some of the above-mentioned algorithms and their applications, and the time and sample complexity of decomposition algorithms. Finally, Chapter 7 deals with the most challenging problem of tensor decomposition: the decomposition of overcomplete tensors.
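The alternating least squares algorithm mentioned above targets the CP (canonical polyadic) decomposition of a tensor; as a brief reminder, in standard notation (not necessarily the book's), a rank-$R$ CP decomposition of a third-order tensor $\mathcal{T} \in \mathbb{R}^{I \times J \times K}$ with factor matrices $A, B, C$ reads

```latex
\[
  \mathcal{T} \;=\; \sum_{r=1}^{R} a_r \circ b_r \circ c_r,
  \qquad
  \mathcal{T}_{ijk} \;=\; \sum_{r=1}^{R} A_{ir} B_{jr} C_{kr},
\]
\[
  \mathcal{T}_{(1)} \;=\; A \,(C \odot B)^{\top},
\]
```

where $\circ$ is the outer product, $\mathcal{T}_{(1)}$ is the mode-1 matricization, and $\odot$ is the Khatri-Rao (columnwise Kronecker) product. The matricization identity is what makes alternating least squares work: fixing two factor matrices turns the update of the third into an ordinary linear least-squares problem, and the algorithm cycles over the modes.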
The text is heavily mathematical, though it requires only knowledge of undergraduate linear algebra, because all more advanced concepts are sufficiently explained when introduced. An exception to this mathematical style is Chapter 5, which presents core Python code implementing some of the material presented earlier. With this chapter, the book goes beyond the limits of a purely theoretical monograph.
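The review does not reproduce the book's Python code; purely as an illustrative sketch of the kind of algorithm involved (the function names `unfold`, `khatri_rao`, and `cp_als` are mine, not the book's API), alternating least squares for a CP decomposition fits in a few lines of numpy:

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` matricization: move that mode to the front, flatten the rest.

    With numpy's C-ordering this convention gives
    unfold(T, 0) = A @ khatri_rao(B, C).T for T built from factors A, B, C.
    """
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Columnwise Kronecker product: column r is kron(U[:, r], V[:, r])."""
    R = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, R)

def cp_als(T, rank, n_iter=200, seed=0):
    """Minimal alternating least squares for a rank-`rank` CP decomposition.

    Each update fixes two factor matrices and solves an ordinary
    least-squares problem for the third, cycling over the three modes.
    """
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Recover a synthetic rank-2 tensor from its entries.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
```

On an exactly low-rank tensor such as this one, the reconstruction error typically drops to numerical noise; for noisy or overcomplete tensors the picture is far subtler, which is precisely the subject of Chapters 6 and 7.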
MSC:
68-02 Research exposition (monographs, survey articles) pertaining to computer science
15A18 Eigenvalues, singular values, and eigenvectors
15A23 Factorization of matrices
15A69 Multilinear algebra, tensor calculus
62H25 Factor analysis and principal components; correspondence analysis
65F99 Numerical linear algebra
68T05 Learning and adaptive systems in artificial intelligence
68T07 Artificial neural networks and deep learning