
Getting Started with Google BERT


Book
  • Format: Book, paperback
  • English
  • 352 pages

Description

Kickstart your NLP journey by exploring BERT and its variants such as ALBERT, RoBERTa, DistilBERT, VideoBERT, and more with Hugging Face's transformers library



Key Features

  • Explore the encoder and decoder of the transformer model
  • Become well-versed with BERT along with ALBERT, RoBERTa, and DistilBERT
  • Discover how to pre-train and fine-tune BERT models for several NLP tasks

Book Description

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. With a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work.

You'll explore the BERT architecture by learning how the BERT model is pre-trained and how to use pre-trained BERT for downstream tasks by fine-tuning it for NLP tasks such as sentiment analysis and text summarization with the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster BERT variants based on knowledge distillation, such as DistilBERT and TinyBERT. The book takes you through M-BERT, XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and explore an interesting variant called VideoBERT.

By the end of this BERT book, you'll be well-versed with using BERT and its variants for performing practical NLP tasks.
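To give a feel for the kind of workflow the description refers to, here is a minimal, illustrative sketch that loads a pre-trained BERT checkpoint with the Hugging Face transformers library and runs it as a two-class sentiment classifier. The checkpoint name and the two-label setup are assumptions for illustration, not the book's exact code, and the classification head still needs fine-tuning on labelled data before its predictions are meaningful:

# Illustrative sketch: load a pre-trained BERT checkpoint with the Hugging Face
# transformers library and run it as a two-class sentiment classifier.
# "bert-base-uncased" and num_labels=2 are assumptions made for this example.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer("I really enjoyed this book!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index (0 or 1)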



What You Will Learn

  • Understand the transformer model from the ground up
  • Find out how BERT works and pre-train it using masked language model (MLM) and next sentence prediction (NSP) tasks
  • Get hands-on with BERT by learning to generate contextual word and sentence embeddings (see the sketch after this list)
  • Fine-tune BERT for downstream tasks
  • Get to grips with ALBERT, RoBERTa, ELECTRA, and SpanBERT models
  • Get the hang of the BERT models based on knowledge distillation
  • Understand cross-lingual models such as XLM and XLM-R
  • Explore Sentence-BERT, VideoBERT, and BART
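As a rough illustration of the embedding point above, the following sketch extracts contextual word embeddings from a pre-trained BERT model and mean-pools them into a sentence embedding. The checkpoint name and the pooling strategy are assumptions, not the book's exact recipe:

# Illustrative sketch: contextual word embeddings and a pooled sentence embedding.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("BERT produces contextual embeddings.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

token_embeddings = outputs.last_hidden_state       # shape [1, seq_len, 768]: one vector per token
sentence_embedding = token_embeddings.mean(dim=1)  # simple mean pooling over tokens (assumed strategy)
print(token_embeddings.shape, sentence_embedding.shape)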

Who this book is for

This book is for NLP professionals and data scientists looking to simplify NLP tasks to enable efficient language understanding using BERT. A basic understanding of NLP concepts and deep learning is required to get the best out of this book.

Details
  • Language: English
  • Pages: 352
  • Publication date: 22-01-2021
  • ISBN-13: 9781838821593
  • Publisher: Packt Publishing
  • Format: Paperback
  • Edition: 1
Size and weight
  • Weight: 658 g
  • Depth: 1.8 cm
  • Width: 19.1 cm
  • Height: 23.5 cm

