Hamiltonian Monte Carlo Methods in Machine Learning

Posted by: literator on 11-03-2023, 04:21 | Comments: 0

Category: BOOKS » PROGRAMMING

Title: Hamiltonian Monte Carlo Methods in Machine Learning
Author: Tshilidzi Marwala, Wilson Tsakane Mongwe
Publisher: Academic Press/Elsevier
Year: 2023
Pages: 222
Language: English
Format: pdf (true), epub (true)
Size: 39.9 MB

Hamiltonian Monte Carlo Methods in Machine Learning introduces methods for the optimal tuning of HMC parameters, along with an introduction to Shadow and Non-canonical HMC methods that bring improvements and speedups. Lastly, the authors address the critical issue of variance reduction for parameter estimates of numerous HMC-based samplers. The book offers a comprehensive introduction to Hamiltonian Monte Carlo methods and provides a cutting-edge exposition of the current pathologies of HMC-based methods in tuning, scaling, and sampling complex real-world posteriors. These pathologies arise mainly in scaling inference (e.g., to Deep Neural Networks), tuning performance-sensitive sampling parameters, and dealing with high sample autocorrelation.

Other sections provide numerous solutions to potential pitfalls, presenting advanced HMC methods with applications in renewable energy, finance and image classification for biomedical applications. Readers will get acquainted with both HMC sampling theory and algorithm implementation.

The 4th Industrial Revolution is evolving, with great technological and social impacts. It is fuelled by rapid advances in the Internet, wireless devices and communications, big data, cloud computing, artificial intelligence (AI) and robotics. Machine learning (ML) is a core branch of AI. It produces useful models from data, which lead to wide applications such as speech recognition, medical diagnosis and autonomous driving. One fundamental task in ML, as well as in optimisation and statistics, is to sample from a probability distribution. These samples can be used to infer models such as deep neural networks, to make optimisation methods more robust by allowing them to escape local minima and saddle points, and to prevent them from overfitting.

Sampling methods are often the key to solving the integration, counting, and volume-computation problems that are rampant in applications at the intersection of the sciences and ML. Modern programming languages provide functions to sample from simple distributions such as the uniform, normal and Poisson. But for a general or Gibbs distribution, Markov chain Monte Carlo (MCMC) methods are commonly employed in Bayesian machine learning: a Markov chain is constructed whose samples follow the desired distribution. There are two basic MCMC techniques. In Gibbs sampling, typically one parameter is drawn from the distribution at a time, holding all others fixed. In the Metropolis algorithm, all the parameters can be varied at once: the parameter vector is perturbed from the current point by a trial step, and the trial position is either accepted or rejected on the basis of its probability relative to the current one. However, Metropolis techniques suffer from low proposal acceptance rates, and the Gibbs algorithm performs poorly in multidimensional problems.
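To make the Metropolis scheme just described concrete, here is a minimal Python sketch (not code from the book). It assumes a one-dimensional standard normal target, and names such as log_target and metropolis are illustrative choices, not part of any particular library.

import numpy as np

def log_target(x):
    # Unnormalised log-density of the target; a standard normal is an illustrative choice.
    return -0.5 * x ** 2

def metropolis(n_samples, step_size=1.0, x0=0.0, seed=0):
    # Random-walk Metropolis: perturb the current point by a trial step, then
    # accept or reject based on the density at the trial position relative to
    # the density at the current one.
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    accepted = 0
    for i in range(n_samples):
        x_trial = x + step_size * rng.normal()
        log_ratio = log_target(x_trial) - log_target(x)
        if np.log(rng.uniform()) < log_ratio:  # accept with probability min(1, ratio)
            x = x_trial
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples

samples, acceptance_rate = metropolis(5000)
print(f"acceptance rate: {acceptance_rate:.2f}, sample mean: {samples.mean():.3f}")

Large trial steps lower the acceptance rate, while small steps accept often but explore slowly and leave highly correlated samples, which is exactly the trade-off the guided proposals of HMC are meant to ease.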

This led to the emergence of a new MCMC technique based on Hamiltonian dynamics: Hamiltonian Monte Carlo (HMC). HMC is an adaptation of the Metropolis technique that employs a guided scheme for generating new proposals, which improves the proposal acceptance rate and, consequently, efficiency. More specifically, HMC uses the gradient of the log posterior to direct the Markov chain towards regions of higher posterior density, where most samples are collected. As a result, a Markov chain using the HMC algorithm accepts proposals at a much higher rate than the traditional Metropolis algorithm. HMC was first discovered by physicists and has since been adopted with much success in ML. It is currently the main MCMC algorithm used in practice.
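As a rough illustration of how HMC uses the log-posterior gradient to guide proposals, the following Python sketch implements a bare-bones sampler with leapfrog integration and a Metropolis acceptance step. It again assumes a standard normal target; the step size and leapfrog count are arbitrary illustrative values, not recommendations from the book.

import numpy as np

def log_target(q):
    # Unnormalised log-density of the target; standard normal for illustration.
    return -0.5 * q ** 2

def grad_log_target(q):
    # Gradient of the log-density, used to guide the proposals.
    return -q

def hmc(n_samples, step_size=0.2, n_leapfrog=20, q0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    q = q0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        p = rng.normal()  # resample an auxiliary momentum variable
        q_new, p_new = q, p
        # Leapfrog integration of the Hamiltonian dynamics.
        p_new += 0.5 * step_size * grad_log_target(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += step_size * p_new
            p_new += step_size * grad_log_target(q_new)
        q_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_target(q_new)
        # Metropolis correction based on the change in total energy (Hamiltonian).
        h_current = -log_target(q) + 0.5 * p ** 2
        h_proposed = -log_target(q_new) + 0.5 * p_new ** 2
        if np.log(rng.uniform()) < h_current - h_proposed:
            q = q_new
        samples[i] = q
    return samples

samples = hmc(2000)
print(f"sample mean: {samples.mean():.3f}, sample std: {samples.std():.3f}")

Because the leapfrog trajectory follows the gradient towards high-density regions, the energy change is small and most proposals are accepted, even for distant moves, which is the efficiency gain over plain Metropolis described above.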

Download Hamiltonian Monte Carlo Methods in Machine Learning
