Author: Kenji Yamanishi
Publisher: Springer
Year: 2023
Pages: 352
Language: English
Format: PDF (true), EPUB
Size: 41.3 MB
This book introduces readers to the minimum description length (MDL) principle and its applications to learning. MDL is a fundamental principle of inductive inference, used in many applications including statistical modeling, pattern recognition, and machine learning. At its core, MDL rests on the premise that "the shortest code length leads to the best strategy for learning anything from data." MDL provides a broad and unifying view of statistical inference, covering estimation, prediction, testing and, of course, machine learning.
Specifically, consider machine learning, which aims to extract useful knowledge from data and put it to effective use. In real-world machine learning applications, the data sources and models are rarely simple enough for an elegant statistical theory to apply straightforwardly. For example, the data source may be non-stationary and may include anomalies. Moreover, the models used for knowledge representation may include latent variables, or may have too many parameters relative to the amount of data, so that conventional asymptotic theory is no longer valid. Even in such realistic situations, the MDL principle guides us toward the best learning strategy in terms of the shortest code length. My intention in writing this book is to demonstrate to readers the universal effectiveness of the MDL principle through its applications to a variety of learning problems.
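To give a flavor of the premise above, here is a minimal sketch of MDL-style model selection. It picks a polynomial degree by minimizing a crude two-part code length, L(data | model) + L(model); the particular formulas (Gaussian coding of residuals plus a (k/2) log n parameter cost) are a common textbook approximation, not a method taken from this book, and the data set is synthetic.

```python
import numpy as np

# Synthetic data: a quadratic signal plus Gaussian noise (true degree is 2).
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 60)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.2, x.size)

def mdl_score(degree):
    """Approximate two-part code length (in bits) for a polynomial fit."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    rss = float(resid @ resid)
    data_bits = 0.5 * n * np.log2(rss / n)   # cost of encoding residuals
    model_bits = 0.5 * k * np.log2(n)        # cost of encoding k parameters
    return data_bits + model_bits

# The degree with the shortest total code length balances fit and complexity:
# too low a degree pays in residual bits, too high a degree pays in parameter bits.
best = min(range(1, 9), key=mdl_score)
print("selected degree:", best)
```

A maximum-likelihood fit alone would always prefer the highest degree; the parameter-cost term is what makes the shortest total description favor a model close to the true one.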
This book is addressed to researchers and graduate students specializing in machine learning, statistics, information theory, or computer science. Readers are assumed to have background knowledge of probability theory, linear algebra, analysis, elementary statistics, and the basics of machine learning and computer science.
Download Learning with the Minimum Description Length Principle