Representation in Machine Learning

Title: Representation in Machine Learning
Author: M.N. Murty, M. Avinash
Publisher: Springer
Year: 2023
Pages: 101
Language: English
Format: pdf (true), epub
Size: 26.9 MB

This book provides a concise but comprehensive guide to representation, which forms the core of Machine Learning (ML). State-of-the-art practical applications pose a number of challenges for the analysis of high-dimensional data. Unfortunately, many popular ML algorithms fail, in both theory and practice, when confronted with the sheer size of the underlying data. Solutions to this problem are covered in the book.

In addition, the book covers a wide range of representation techniques that are important for academics and ML practitioners alike, such as Locality Sensitive Hashing (LSH), Distance Metrics and Fractional Norms, Principal Components (PCs), Random Projections and Autoencoders. Several experimental results are provided in the book to demonstrate the discussed techniques’ effectiveness.

In practical applications of current interest, the data is typically high dimensional. These applications include image classification, information retrieval, problem solving in AI, biological and chemical structure analysis, and social network analysis. A major problem with analyzing such high-dimensional data is that most of the popular tools, like the k-nearest neighbor classifier, the decision tree classifier, and several clustering algorithms that depend on inter-pattern distance computations, fail to work well. Representing the data in a lower-dimensional space therefore becomes inevitable.
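
As an illustrative aside (not taken from the book), the following minimal NumPy sketch shows one facet of this problem, often called distance concentration: for random data, the relative gap between a query's nearest and farthest neighbors shrinks as the dimensionality grows, which weakens any method built on inter-pattern distances.

import numpy as np

rng = np.random.default_rng(0)

for d in (2, 10, 100, 1000):
    X = rng.random((500, d))          # 500 random points in the unit cube [0, 1]^d
    q = rng.random(d)                  # a random query point
    dists = np.linalg.norm(X - q, axis=1)
    # Relative contrast between the farthest and nearest point; it shrinks as d grows.
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"d={d:5d}  relative contrast = {contrast:.3f}")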

This book is organized as follows: Chapter 1 gives a general introduction to Machine Learning and various concepts, including feature engineering, model selection, model estimation, model validation, and model explanation. Two important tasks in ML are classification and clustering, so Chapter 2 deals with the representation of data items, classes, and clusters.

Nearest neighbor finding algorithms play an important role in several ML tasks. However, finding nearest neighbors in high-dimensional spaces can be both time consuming and inaccurate. In Chapter 3, we deal with nearest neighbor finding using fractional norms and with approximate nearest neighbor computation using locality-sensitive hashing. We illustrate these techniques on several benchmark data sets.
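
To make approximate nearest neighbor search concrete, here is a minimal random-hyperplane LSH sketch. It is an illustration under our own assumptions, not code from the book: each point is reduced to a short bit signature, and exact distances are computed only for the few points whose signatures nearly match the query's.

import numpy as np

rng = np.random.default_rng(1)

def signatures(points, planes):
    # One bit per hyperplane: 1 if the point falls on its positive side.
    return (points @ planes.T > 0).astype(np.uint8)

d, n_planes = 64, 16
planes = rng.normal(size=(n_planes, d))   # random hyperplane normals

X = rng.normal(size=(1000, d))            # database points
q = rng.normal(size=d)                    # query point

codes = signatures(X, planes)             # shape (1000, 16)
q_code = signatures(q[None, :], planes)[0]

# Candidate neighbors: points whose signatures differ from the query in few bits.
hamming = (codes != q_code).sum(axis=1)
candidates = np.argsort(hamming)[:10]

# Exact distances are computed only on this small candidate set.
best = candidates[np.argmin(np.linalg.norm(X[candidates] - q, axis=1))]
print("approximate nearest neighbour index:", best)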

Chapter 4 deals with feature selection and linear feature extraction schemes. It includes a discussion of principal components, random projections, and nonnegative matrix factorization. Nonlinear feature extraction schemes are gaining importance because of Deep Learning architectures based on autoencoders and multilayer perceptrons; these topics are examined in Chapter 5.
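
As a rough illustration of the linear schemes mentioned for Chapter 4 (again, our own sketch rather than the book's code), the fragment below reduces data to k dimensions in two ways: principal components computed from the data via the SVD, and a data-independent Gaussian random projection.

import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 50))      # 200 samples, 50 features
k = 5                                # target dimensionality

# Principal components: project onto the top-k right singular vectors
# of the mean-centred data.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = Xc @ Vt[:k].T                # shape (200, 5)

# Random projection: a scaled Gaussian matrix roughly preserves pairwise
# distances (Johnson-Lindenstrauss) and requires no training at all.
R = rng.normal(size=(50, k)) / np.sqrt(k)
X_rp = X @ R                         # shape (200, 5)

print(X_pca.shape, X_rp.shape)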

Audience:
The coverage is meant for both students and teachers and helps practitioners implement ML algorithms. It is intended for senior undergraduate and graduate students and researchers working in Machine Learning, data mining, and pattern recognition. The material is presented so that it is accessible to a wide variety of readers with some basic exposure to undergraduate-level mathematics, and the presentation is intentionally kept simple so that readers feel comfortable.

Download Representation in Machine Learning