Advances in Deep Learning, Volume 2


Title: Advances in Deep Learning, Volume 2
Authors: M. Arif Wani, Bisma Sultan, Sarwat Ali, Mukhtar Ahmad Sofi
Publisher: Springer
Year: 2025
Pages: 199
Language: English
Format: pdf (true), epub
Size: 32.4 MB

This book describes novel ways of using Deep Learning to solve real-world problems. It covers advanced Deep Learning topics like neural architecture search, ensemble Deep Learning, transfer learning techniques, lightweight architectures, hybrid Deep Learning approaches, and generative adversarial networks. The book discusses the use of these advanced topics in selected applications like image classification, object detection, image steganography, protein secondary structure prediction, and gene expression data classification. Various challenges and future research directions falling under the scope of these topics are discussed.

Steganography, the practice of hiding information within digital media, has long been used for secure communication. Traditional methods of steganography were often susceptible to detection or offered limited capacity for hidden data. However, with the advent of deep learning, new techniques have emerged that significantly improve the robustness, capacity, and security of hidden information.

- Highlights novel ways of using advanced Deep Learning topics to solve real-world problems
- Offers insights into Deep Learning architectures and algorithms
- Elaborates on both basic and advanced concepts in Deep Learning

The book is organized into thirteen chapters:

Chapter 1 discusses the impact of deep learning in three important areas: Neural Architecture Search (NAS), steganography, and medical applications. The chapter introduces NAS, an approach that automates the search for neural network architectures, resulting in efficient and high-performance models. It then discusses steganography, outlining the impact of advanced deep learning methods for secure data embedding within digital media. The chapter also highlights the achievements of deep learning in the medical field and outlines future research directions in these three areas.

Chapter 2 explores the fundamental concepts of Evolutionary Algorithm-Based Neural Architecture Search (NAS). By employing principles of natural selection, evolutionary NAS iteratively evolves optimal neural architectures across generations, effectively exploring a vast search space of architectures. The integration of techniques such as mutation, crossover, and selection enhances the diversity and adaptability of architectures for complex tasks like image classification. The chapter also evaluates the performance of various evolutionary NAS methods, comparing their effectiveness and identifying promising avenues for future research and development.
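As a rough illustration (not the chapter's exact algorithms), the sketch below shows a toy evolutionary NAS loop: architectures are encoded as lists of candidate operations, the fitness function is a placeholder for "train briefly on a proxy task and measure validation accuracy", and selection, crossover, and mutation drive the search across generations.

```python
import random

# Toy evolutionary NAS sketch. The encoding, operation set, and fitness proxy
# are illustrative assumptions, not the methods evaluated in the chapter.
OPS = ["conv3x3", "conv5x5", "sep_conv", "max_pool", "skip"]

def random_arch(depth=6):
    return [random.choice(OPS) for _ in range(depth)]

def fitness(arch):
    # Placeholder: a real system would train `arch` and return validation accuracy.
    return sum(1.0 for op in arch if op != "skip") + random.random()

def mutate(arch, rate=0.2):
    return [random.choice(OPS) if random.random() < rate else op for op in arch]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=10):
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]                      # selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]   # variation
        population = parents + children
    return max(population, key=fitness)

print(evolve())
```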

Chapter 3 focuses on Gradient-Based Neural Architecture Search, an approach that automates the design of neural network architectures. It examines in depth DARTS, a foundational gradient-based NAS technique that formulates architecture search as a continuous optimization problem. An experimental analysis showcases the efficiency and effectiveness of gradient-based NAS, highlighting its practical applications. The chapter concludes with a discussion of future research directions, underscoring the importance of balancing accuracy, efficiency, and computational cost in advancing this powerful architecture search paradigm.
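A minimal sketch of the continuous relaxation at the heart of DARTS is shown below; the candidate operation set and single-edge setup are illustrative assumptions, and the full method additionally alternates gradient updates between architecture parameters (on validation data) and network weights (on training data).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# One DARTS-style "mixed" edge: architecture parameters `alpha` weight each
# candidate operation via softmax, so the discrete choice of operation becomes
# differentiable and can be optimized by gradient descent.
class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),                       # skip connection
        ])
        # One architecture parameter per candidate operation on this edge.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

x = torch.randn(2, 16, 32, 32)
edge = MixedOp(16)
out = edge(x)
out.mean().backward()    # gradients flow into both the weights and alpha
print(out.shape, edge.alpha.grad)
```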

Chapter 4 presents a new training methodology aimed at improving the performance of deep learning models. The approach utilizes a coarse-to-fine-tuning strategy that incorporates selective freezing techniques, specifically Simple Selective Freezing (SSF) and Progression-Based Selective Freezing (PSF). Initially, coarse training is performed on deep learning architectures, followed by the application of these selective freezing methods to fine-tune the model. This approach can be applied to architectures obtained either manually or through Neural Architecture Search (NAS) methods. The experiments on the CIFAR-10 dataset, using an architecture derived from DARTS, reveal that the coarse-to-fine-tuning approach outperforms traditional training methods.
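The sketch below illustrates the general idea of coarse-to-fine training with selective freezing in PyTorch; the freeze-the-first-k-blocks rule is only a stand-in, since SSF and PSF define their own criteria for which layers to freeze and when.

```python
import torch
import torch.nn as nn

# Illustrative model; the chapter's experiments use a DARTS-derived architecture.
model = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 10),
)

def freeze_first_blocks(net, k):
    """Disable gradients for the first k child modules (selective freezing)."""
    for i, child in enumerate(net.children()):
        trainable = i >= k
        for p in child.parameters():
            p.requires_grad = trainable

# Phase 1: coarse training of the whole network (all parameters trainable).
coarse_opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Phase 2: fine-tuning with the earliest layers frozen; only the still
# trainable parameters are handed to the optimizer.
freeze_first_blocks(model, k=2)
fine_opt = torch.optim.SGD((p for p in model.parameters() if p.requires_grad), lr=0.01)

x, y = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
fine_opt.step()    # frozen layers receive no updates
```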

Chapter 5 discusses various Generative Adversarial Network (GAN) architectures that have been used in image steganography. GANs have gained considerable attention in image steganography mainly because they can efficiently encode and decode secret information within digital images. Various GAN-based techniques that embed and extract secret data seamlessly in images, offering a robust solution for secure communication and data concealment, are discussed...
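As a hedged illustration of the general recipe (not the specific architectures covered in the chapter), the sketch below hides a secret bit-plane inside a cover image with an encoder, recovers it with a decoder, and uses a discriminator to keep stego images indistinguishable from covers; the network sizes and loss weighting are toy assumptions.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Input: cover image (3 ch) concatenated with the secret bit-plane (1 ch).
        self.net = nn.Sequential(nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(32, 3, 3, padding=1), nn.Tanh())
    def forward(self, cover, secret):
        return self.net(torch.cat([cover, secret], dim=1))

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(32, 1, 3, padding=1))   # bit logits
    def forward(self, stego):
        return self.net(stego)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
                                 nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1))
    def forward(self, img):
        return self.net(img)

enc, dec, disc = Encoder(), Decoder(), Discriminator()
g_opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

cover = torch.rand(4, 3, 32, 32)
secret = torch.randint(0, 2, (4, 1, 32, 32)).float()

# Generator step: stay close to the cover, recover the secret, fool the critic.
stego = enc(cover, secret)
g_loss = (nn.functional.mse_loss(stego, cover)
          + bce(dec(stego), secret)
          + bce(disc(stego), torch.ones(4, 1)))
g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Discriminator step: separate real covers from generated stego images.
d_loss = bce(disc(cover), torch.ones(4, 1)) + bce(disc(stego.detach()), torch.zeros(4, 1))
d_opt.zero_grad(); d_loss.backward(); d_opt.step()
```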

Download Advances in Deep Learning, Volume 2
