Author: Yipeng Liu
Publisher: Academic Press, Elsevier
Year: 2022
Pages: 598
Language: English
Format: PDF (true)
Size: 17.9 MB
Tensors for Data Processing: Theory, Methods and Applications presents both classical and state-of-the-art tensor computation methods for data processing, covering computation theory, processing methods, and computing and engineering applications, with an emphasis on techniques for data processing. This reference is ideal for students, researchers, and industry developers who want to understand and use tensor-based data processing theories and methods.
As a higher-order generalization of a matrix, tensor-based processing can avoid the loss of multi-linear data structure that occurs in classical matrix-based data processing methods. This move from matrices to tensors is beneficial for many diverse application areas, including signal processing, computer science, acoustics, neuroscience, communication, medical engineering, seismology, psychometrics, chemometrics, biometrics, quantum physics, and quantum chemistry.
- Provides a complete reference on classical and state-of-the-art tensor-based methods for data processing
- Includes a wide range of applications from different disciplines
- Gives guidance on applying these methods in practice
The first chapter is an introduction to tensor decomposition. The following chapters present variants of tensor decompositions together with efficient and effective solution algorithms, including parallel algorithms, Riemannian algorithms, and generalized thresholding algorithms. Tensor-based machine learning methods are then summarized in detail, including tensor completion, tensor principal component analysis, support tensor machines, tensor-based kernel learning, and tensor-based deep learning. To demonstrate that tensors can effectively and systematically enhance performance in practical engineering problems, the book gives implementation details for many applications, such as signal recovery, recommender systems, climate forecasting, image clustering, image classification, network compression, data fusion, image enhancement, neuroimaging, and remote sensing.
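To give a flavor of the kind of method the opening chapters introduce, the sketch below (not taken from the book) implements a basic CP (CANDECOMP/PARAFAC) decomposition by alternating least squares in plain NumPy. The helper names `unfold`, `khatri_rao`, and `cp_als` are illustrative choices, and the unfolding convention is the simple C-order reshape, not necessarily the one used in the text.

```python
import numpy as np

# Didactic CP-ALS sketch for a 3-way tensor; not the book's reference code.

def unfold(T, mode):
    """Mode-n unfolding: move the chosen axis first, then flatten the rest (C order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: column r is the Kronecker product of A[:, r] and B[:, r]."""
    r = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, r)

def cp_als(T, rank, n_iter=200, seed=0):
    """Alternating least squares for T ~ sum_r a_r (outer) b_r (outer) c_r."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Each factor is updated by solving a linear least-squares problem
        # against the matching unfolding and the Khatri-Rao of the other two.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

# Example: recover a synthetic rank-3 tensor; the relative error should be near zero.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.random((10, 3)), rng.random((8, 3)), rng.random((6, 3))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=3)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print('relative error:', np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```

The book goes well beyond this toy setup, covering Tucker, tensor-train, and other decompositions along with the scalable and structured algorithms listed above.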
Chapter 7, Tensors for deep learning theory, turns to deep learning. Deep learning architectures have enabled unprecedented advances in a wide range of artificial intelligence applications, and their empirical success has posed fundamental riddles about their operation that sit at the front lines of modern theoretical machine learning research. Related theoretical efforts can be broadly divided into (i) explaining the observed success of deep learning architectures and (ii) harnessing these insights to improve their operation. The chapter outlines a tensor-analysis-based contribution to understanding and improving the expressivity of prominent deep learning architecture classes, and details a proof methodology based on analyzing grid tensors of the functions realized by such architectures, which has been applied to convolutional, recurrent, and self-attention networks.
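As a rough illustration of the grid-tensor idea mentioned above (an assumption-laden sketch, not the chapter's construction): a multivariate function is evaluated on all combinations of a fixed set of template vectors, and rank measures of the resulting tensor serve as a proxy for expressivity. The toy score function and all names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
templates = rng.standard_normal((4, 2))   # 4 template vectors in R^2 (assumed setup)

# Toy multiplicative score over 3 inputs: f(x1, x2, x3) = sum_r prod_n <W[r, n], x_n>.
W = rng.standard_normal((2, 3, 2))        # 2 terms, 3 inputs, input dimension 2

def f(x1, x2, x3):
    return sum(np.prod([W[r, n] @ x for n, x in enumerate((x1, x2, x3))])
               for r in range(W.shape[0]))

# Grid tensor: entry (d1, d2, d3) is f evaluated on the corresponding templates.
G = np.array([[[f(templates[d1], templates[d2], templates[d3])
                for d3 in range(4)] for d2 in range(4)] for d1 in range(4)])

# The rank of a matricization of G bounds how many terms a shallow (CP-like)
# model needs to match f on the grid; here it is at most 2 by construction.
M = G.reshape(4, 16)                      # matricize: first mode versus the rest
print('matricization rank:', np.linalg.matrix_rank(M))
```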
Download Tensors for Data Processing