Author: Shashank Mohan Jain
Publisher: Apress
Year: 2022
Format: True PDF
Pages: 169
Size: 10 MB
Language: English
This book covers the Transformer architecture and its relevance to natural language processing (NLP). It starts with an introduction to NLP and the progression of language models from n-grams to Transformer-based architectures. Next, it offers basic Transformer examples that run in the Google Colab environment. It then introduces the Hugging Face ecosystem and the libraries and models it provides. Moving forward, it explains language models such as Google's BERT with examples, before providing a deep dive into the Hugging Face API, using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation.
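As a minimal sketch of the kind of Hugging Face API usage the blurb describes (not an excerpt from the book; the default pipeline models and the example texts below are illustrative assumptions), tasks like sentiment analysis and summarization can be driven through the library's pipeline interface:

from transformers import pipeline

# Sentiment analysis: classify a sentence as positive or negative.
sentiment = pipeline("sentiment-analysis")
print(sentiment("The Transformer architecture made many NLP tasks much easier."))

# Summarization: condense a longer passage into a short summary.
summarizer = pipeline("summarization")
text = (
    "Transformers replaced recurrent networks in many NLP systems. "
    "They rely on self-attention to model relationships between tokens, "
    "which allows parallel training and better handling of long contexts."
)
print(summarizer(text, max_length=30, min_length=10))

The book's own chapters walk through these tasks in more detail, including model selection and working in Google Colab.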