Author: Tony Pourmohamad, Herbert K. H. Lee
Publisher: Springer
Year: 2021
Pages: 113
Language: English
Format: PDF (true), EPUB
Size: 26.5 MB
This book introduces readers to Bayesian optimization, highlighting advances in the field and showcasing its successful applications to computer experiments. R code is available as online supplementary material for most of the included examples, so that readers can better understand and reproduce the methods.
Compact and accessible, the volume is broken down into four chapters. Chapter 1 introduces the reader to the topic of computer experiments and includes a variety of examples across many industries. Chapter 2 focuses on the task of surrogate model building and covers several surrogate models used in the computer modeling and Machine Learning (ML) communities. Chapter 3 introduces the core concepts of Bayesian optimization and discusses unconstrained optimization. Chapter 4 moves on to constrained optimization and showcases some of the newest methods in the field.
Optimization is a field with a long history and much past research. Our focus here is on a very specific sub-area, that of derivative-free optimization of expensive black-box functions. While much of the optimization world involves efficient algorithms that make use of gradient information, we operate in an environment where gradient information is typically not available. The application area we are most familiar with is that of deterministic computer simulation experiments, where complex computer code attempts to model a real-world phenomenon. The code can be run for any inputs, but it only returns output values, and does not provide any gradient information for those outputs. Also, running the code for the same input will always return the same output. The code can be expensive to run, requiring significant computing power and time. Thus, there is a need for efficient optimization routines that do not rely on information about derivatives of the function. That is the focus of this book.
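To make the black-box setting concrete, here is a minimal sketch in R (not taken from the book's supplementary code; the toy function f and its inputs are hypothetical). The function stands in for an expensive deterministic simulator, and base R's derivative-free Nelder-Mead routine in optim is used to search it using output values only.

# A hypothetical, cheap stand-in for a deterministic computer model:
# the same input always returns the same output, and no gradients are exposed.
f <- function(x) {
  sum((x - c(0.3, 0.7))^2) + 0.1 * sin(5 * x[1]) * cos(3 * x[2])
}

# Nelder-Mead is a classical derivative-free method available in base R.
fit <- optim(par = c(0.5, 0.5), fn = f, method = "Nelder-Mead")
fit$par    # input located by the search
fit$value  # corresponding (deterministic) output

Bayesian optimization, introduced in Chapter 3, targets the case where each run of f is expensive and the evaluation budget is small, replacing this kind of direct search with a surrogate model that guides where to evaluate next.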
Moreover, there are few books devoted to either Bayesian optimization or computer experiments, and even fewer devoted solely to both. The term “Bayesian optimization” comes from the Machine Learning literature and has become a very hot topic in the Machine Learning community due to its highly successful application to the optimization of tuning parameters for Machine Learning models. Much work has also been done in applied math, statistics, and engineering, sometimes under different names, and the mixing of these disciplines has led to useful insights and new research directions. Although the roots of Bayesian optimization can be traced back to early optimization problems in computer experiments, its application within the field of computer experiments is still in a nascent phase compared to Machine Learning.
We provide R code as supplementary material for most examples in the book so that the methods are understandable, reproducible, and transparent. The R code can be found on GitHub. R is by no means the only software language for conducting Bayesian optimization; in fact, a significant amount of Bayesian optimization software development has been undertaken by the Python programming community. For example, the Python package BoTorch (Balandat et al. 2020) provides users with out-of-the-box software for implementing Bayesian optimization with minimal effort. Lastly, we note that all calculations and analyses in the book were carried out using R version 4.0.5 (2021-03-31), "Shake and Throw".
This book will be a useful companion for researchers and practitioners working with computer experiments and computer modeling. Additionally, readers with a background in Machine Learning but minimal background in computer experiments will find it an interesting case study of the applicability of Bayesian optimization outside the realm of Machine Learning.
Download Bayesian Optimization with Application to Computer Experiments