Natural Language Processing: NLP With Transformers in Python
Learn next-generation NLP with transformers for sentiment analysis, Q&A, similarity search, NER, and more
What you’ll learn
- Industry-standard NLP using transformer models
- Build full-stack question-answering transformer models
- Perform sentiment analysis with transformer models in PyTorch and TensorFlow (see the sketch after this list)
- Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)
- Create fine-tuned transformer models for specialized use-cases
- Measure the performance of language models using advanced metrics like ROUGE
- Vector-building techniques like BM25 and dense passage retrieval (DPR)
- An overview of recent developments in NLP
- Understand attention and other key components of transformers
- Learn about key transformer models such as BERT
- Preprocess text data for NLP
- Named entity recognition (NER) using spaCy and transformers
- Fine-tune language classification models
Requirements
- Knowledge of Python
- Experience in data science a plus
- Experience in NLP a plus
Description
Transformer models are the de facto standard in modern NLP. They have proven themselves to be the most expressive and powerful models for language by a large margin, beating all major language benchmarks time and time again.
In this course, you will learn everything you need to get started building cutting-edge NLP applications with transformer models like Google AI’s BERT and Facebook AI’s DPR.
We cover several key NLP frameworks including:
- HuggingFace’s Transformers
- TensorFlow 2
- PyTorch
- spaCy
- NLTK
- Flair
We then learn how to apply transformers to some of the most popular NLP use-cases:
- Language classification/sentiment analysis
- Named entity recognition (NER), sketched after this list
- Question answering
- Similarity/comparative learning
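For the NER use-case, a quick sketch with spaCy's small pretrained English pipeline; the course also covers transformer-based NER, and the example sentence here is ours, not from the course.

```python
# Minimal NER sketch with spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Reddit traders pushed GameStop stock up 400% in January 2021.")

for ent in doc.ents:
    print(ent.text, ent.label_)  # e.g. GameStop ORG, 400% PERCENT, January 2021 DATE
```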
Throughout each of these use-cases we work through a variety of examples to ensure we understand what transformers are, how they work, and why they matter. Alongside these sections we also work through two full-size NLP projects: one performing sentiment analysis on financial Reddit data, and another building a fully fledged open-domain question-answering application (a minimal Q&A sketch follows below).
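To make the question-answering project concrete, here is a minimal extractive Q&A sketch with the Transformers pipeline. In the open-domain setting built in the course, the context passage would come from a retriever (e.g. BM25 or DPR over Elasticsearch or FAISS) rather than being hard-coded as it is here.

```python
# Minimal extractive question-answering sketch with HuggingFace Transformers.
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default SQuAD-tuned model

result = qa(
    question="Which models dominate modern NLP?",
    context=(
        "Transformer models are the de facto standard in modern NLP, "
        "beating all major language benchmarks time and time again."
    ),
)
print(result["answer"], round(result["score"], 3))
```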
All of this is supported by several other sections covering how to better design, implement, and measure the performance of our models, such as:
- History of NLP and where transformers come from
- Common preprocessing techniques for NLP
- The theory behind transformers
- How to fine-tune transformers (see the sketch after this list)
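As a preview of the fine-tuning material, a compact sketch using the Trainer API. The IMDB dataset, BERT checkpoint, and hyperparameters below are placeholder choices for illustration, not the course's exact setup.

```python
# Compact fine-tuning sketch with the HuggingFace Trainer API.
# Assumes: pip install transformers datasets
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Truncate/pad each review to a fixed length the model can consume.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-imdb",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    # A small subset keeps the demo quick; use the full split for real runs.
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
)
trainer.train()
```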
We cover all this and more. I look forward to seeing you in the course!
Who this course is for:
- Aspiring data scientists and ML engineers interested in NLP
- Practitioners looking to upgrade their skills
- Developers looking to implement NLP solutions
- Data scientists
- Machine learning engineers
- Python developers
Size: 3.61 GB
https://www.udemy.com/course/nlp-with-transformers/
Can't access this link.