Author: Christian Blum
Publisher: Springer
Series: Computational Intelligence Methods and Applications
Year: 2024
Pages: 202
Language: English
Format: pdf (true), epub
Size: 30.7 MB
This book describes a general hybrid metaheuristic for combinatorial optimization called Construct, Merge, Solve & Adapt (CMSA). The general idea of standard CMSA is as follows. At each iteration, a number of valid solutions to the tackled problem instance are generated in a probabilistic way, each of them composed of a set of solution components. The components found in the generated solutions are then added to an initially empty sub-instance. Next, an exact solver is applied to compute the best solution to the sub-instance, which in turn is used to update the sub-instance provided as input for the next iteration. In this way, the power of exact solvers can be exploited for problem instances that are much too large for a standalone application of the solver.
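To make this loop more concrete, the following Python sketch mirrors the construct, merge, solve and adapt phases described above. It is only an illustration: construct_solution() and solve_subinstance() are hypothetical problem-specific helpers, the age-based removal of components in the adapt step follows the standard CMSA scheme, and the book's actual implementations are written in C++.

```python
import time

# Illustrative sketch of the generic CMSA loop (not the book's C++ code).
# construct_solution() and solve_subinstance() are hypothetical helpers that
# would be implemented for the specific problem at hand.

def cmsa(instance, construct_solution, solve_subinstance,
         n_constructions=10, age_limit=3, time_budget=60.0):
    ages = {}                     # solution component -> current age
    best = None
    start = time.time()
    while time.time() - start < time_budget:
        # Construct & merge: probabilistically generate valid solutions and
        # add their components to the (initially empty) sub-instance.
        for _ in range(n_constructions):
            for component in construct_solution(instance):
                ages.setdefault(component, 0)
        # Solve: apply an exact solver to the sub-instance induced by the
        # collected components (e.g., an ILP solver).
        incumbent = solve_subinstance(instance, set(ages))
        if best is None or incumbent.objective < best.objective:
            best = incumbent
        # Adapt: reset the age of components used in the incumbent solution,
        # age all other components, and drop those that grew too old.
        for component in list(ages):
            if component in incumbent.components:
                ages[component] = 0
            else:
                ages[component] += 1
                if ages[component] > age_limit:
                    del ages[component]
    return best
```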
Important research lines on CMSA from recent years are covered in this book. After an introductory chapter on standard CMSA, subsequent chapters cover a self-adaptive CMSA variant as well as a variant equipped with a learning component for improving the quality of the generated solutions over time. Furthermore, after outlining the advantages of using set-covering-based integer linear programming models for sub-instance solving, the author shows how to apply CMSA to problems naturally modelled by non-binary integer linear programming models. The book concludes with a chapter on topics such as the development of a problem-agnostic CMSA and the relation between large neighborhood search and CMSA. Combinatorial optimization problems used in the book as test cases include the minimum dominating set problem, the variable-sized bin packing problem, and an electric vehicle routing problem.
First of all, note that all approaches described in this book were implemented in C++ and compiled with one of the latest GNU compilers. Moreover, unless otherwise stated, CPLEX version 22.1 was used as the ILP solver. Apart from these fundamental tools for the research described in this book, we made use of the three additional tools described below.

One of the challenges in conducting experimental evaluations of stochastic optimization algorithms is the issue of parameter tuning. This task involves configuring the algorithm's parameters to well-working values, ensuring its peak performance on the designated set of benchmark instances. The literature offers several scientific tools for this purpose, including SMAC3, ParamILS and HyperBand. In this book, we make use of irace, one of the most widely used tools for tuning the parameters of optimization algorithms. irace is particularly useful when working with algorithms that have multiple parameters and whose performance depends significantly on finding the right combination of parameter settings; the irace tool automates this process. In summary, irace is a powerful tool for researchers and practitioners who need to find well-working parameter settings for their algorithms efficiently and effectively. It can save a significant amount of time and computational resources by automating the parameter-tuning process and helping users discover the best configuration for their specific tasks.
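To illustrate what automated parameter tuning means in practice, the toy Python sketch below performs a simple random search over a hypothetical configuration space. It is emphatically not irace's racing procedure; the parameter names and the evaluate() function are assumptions made purely for this example.

```python
import random

# Toy illustration of automated parameter tuning (a plain random search,
# NOT irace's racing procedure): evaluate candidate configurations on a set
# of tuning instances and keep the best-performing one.

def tune(evaluate, instances, n_candidates=50, seed=0):
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(n_candidates):
        config = {
            "n_constructions": rng.randint(5, 50),   # hypothetical CMSA parameters
            "age_limit": rng.randint(1, 10),
            "solver_time": rng.uniform(1.0, 30.0),
        }
        # Average performance of this configuration over the tuning instances
        # (evaluate() is an assumed, problem-specific function).
        score = sum(evaluate(config, inst) for inst in instances) / len(instances)
        if score < best_score:
            best_config, best_score = config, score
    return best_config
```

Tools such as irace go well beyond this naive scheme, for example by discarding poorly performing configurations early on the basis of statistical tests.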
Visual representations of complex concepts can significantly enhance our ability to grasp information. This principle extends across various domains within Computer Science and AI. Remarkably, the research community in combinatorial optimization has yet to achieve significant progress in the development of visual aids, despite the growing demand for innovative tools that facilitate the comparison of optimization algorithms. Over the past few decades, the conventional approach to comparing optimization algorithms has centered on gathering numerical data from runs of different algorithms. This data is subsequently analyzed using conventional tools, such as tables and classical data charts (e.g., line plots, bar plots, and scatter plots). It has also become customary to complement this form of algorithm comparison with statistical analyses of the collected data. In recent years, an increasing number of researchers have recognized the importance of incorporating supplementary graphical tools to gain a more profound understanding of the behavior of optimization algorithms, particularly metaheuristics. One such tool used in this book, labeled STNWeb, is a web application based on the concept of so-called Search Trajectory Networks (STNs) for analyzing the behavior of metaheuristics.
In addition to the graphical STNWeb tool, algorithms are also compared on the basis of numerical results. In order to do this on a scientific basis, we make use of statistical comparison, which involves employing statistical methods to assess and differentiate the performance of various optimization techniques. Such a statistical comparison aims to determine whether observed differences in the outcomes are statistically significant or merely due to chance. The goal is to provide a robust and objective basis for selecting the most effective optimization approach in a given context. For this purpose, we make use of the R package scmamp. Several works have outlined a fundamental classification of general Machine Learning scenarios and the corresponding statistical tests suitable for each scenario. The scmamp package implements the recommendations from these works and is aimed specifically at the comparative analysis of multiple algorithms across multiple problem instances.
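As a rough illustration of the kind of analysis this involves, the following Python snippet applies a Friedman test to a small, made-up matrix of results (rows are problem instances, columns are algorithms) using scipy.stats. This is not the scmamp API, which is an R package; it merely shows the non-parametric multiple-algorithm comparison that scmamp automates, including the subsequent post-hoc analysis.

```python
import numpy as np
from scipy import stats

# Made-up example data, purely for illustration: each row is a problem
# instance, each column an algorithm, entries are average objective values.
results = np.array([
    [103.2, 101.7, 100.9],
    [ 88.5,  87.9,  86.4],
    [ 97.1,  95.0,  95.3],
    [120.4, 118.8, 117.6],
    [ 76.0,  75.2,  74.9],
])

# Friedman test: are the differences between the algorithms across the
# instances statistically significant, or plausibly due to chance?
statistic, p_value = stats.friedmanchisquare(*results.T)
print(f"Friedman chi-square = {statistic:.3f}, p-value = {p_value:.4f}")

# If the p-value is below the chosen significance level (e.g., 0.05),
# post-hoc pairwise tests with a correction for multiple comparisons
# (as provided by scmamp) would identify which algorithms differ.
```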
The book is intended for, and will be valuable to, researchers, professionals, and graduate students working in a wide range of fields, such as combinatorial optimization, algorithmics, metaheuristics, mathematical modeling, evolutionary computing, operations research, Artificial Intelligence, and statistics.
Download Construct, Merge, Solve & Adapt: A Hybrid Metaheuristic for Combinatorial Optimization