Natural Language Processing with PyTorch

This course covers the use of advanced neural network constructs and architectures, such as recurrent neural networks, word embeddings, and bidirectional RNNs, to solve complex word and language modeling problems using PyTorch. 
Course info
Rating: (18)
Level: Advanced
Updated: Aug 5, 2019
Duration: 2h 57m
Table of contents
Course Overview
Implementing Recurrent Neural Networks (RNNs) in PyTorch
Performing Binary Text Classification Using Words
Performing Multi-class Text Classification Using Characters
Performing Sentiment Analysis Using Word Embeddings
Performing Language Translation Using Sequence-to-Sequence Models
Description

From chatbots to machine-generated literature, some of the hottest applications of ML and AI these days are for data in textual form. In this course, Natural Language Processing with PyTorch, you will gain the ability to design and implement complex text processing models using PyTorch, which is fast emerging as a popular choice for building deep-learning models owing to its flexibility, ease of use, and built-in support for optimized hardware such as GPUs. First, you will learn how to leverage recurrent neural networks (RNNs) to capture sequential relationships within text data. Next, you will discover how to express text using word vector embeddings, a sophisticated form of encoding that is supported out of the box in PyTorch via the torchtext utility. Finally, you will explore how to build complex multi-level RNNs and bidirectional RNNs to capture both backward and forward relationships within data. You will round out the course by building sequence-to-sequence RNNs for language translation. When you are finished with this course, you will have the skills and knowledge to design and implement complex natural language processing models using sophisticated recurrent neural networks in PyTorch.
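
To give a flavor of the techniques named above, here is a minimal sketch (not course material) of a PyTorch text classifier that combines a word embedding layer with a bidirectional LSTM. All names and sizes (vocab_size, embed_dim, hidden_dim, num_classes) are illustrative assumptions, not values used in the course.

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    """Illustrative classifier: word embeddings -> bidirectional LSTM -> linear head."""
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True reads the sequence both forward and backward
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)  # 2x for the two directions

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)       # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)       # hidden: (2, batch, hidden_dim)
        # concatenate the final forward and backward hidden states
        combined = torch.cat([hidden[0], hidden[1]], dim=1)
        return self.fc(combined)                   # (batch, num_classes) logits

# Toy usage: classify a batch of two 7-token "sentences" of random word ids
model = BiLSTMClassifier()
logits = model(torch.randint(0, 10000, (2, 7)))
print(logits.shape)  # torch.Size([2, 2])
```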

About the author

A problem solver at heart, Janani has a master's degree from Stanford and worked for 7+ years at Google. She was one of the original engineers on Google Docs and holds 4 patents for its real-time collaborative editing framework.

More from the author
Summarizing Data and Deducing Probabilities (Intermediate, 2h 50m, Jul 8, 2021)
More courses by Janani Ravi
Section Introduction Transcripts

Course Overview
Hi. My name is Janani Ravi, and welcome to this course on Natural Language Processing with PyTorch. A little about myself: I have a master's degree in electrical engineering from Stanford and have worked at companies such as Microsoft, Google, and Flipkart. At Google, I was one of the first engineers working on real-time collaborative editing in Google Docs, and I hold four patents for its underlying technologies. I currently work on my own startup, Loonycorn, a studio for high-quality video content. From chatbots to machine-generated literature, some of the hottest applications of machine learning and artificial intelligence these days are for data in textual form. In this course, you will gain the ability to design and implement complex text processing models using PyTorch. First, you will learn how to leverage recurrent neural networks, or RNNs, to capture sequential relationships within text data. Next, you will discover how to express text using word vector embeddings, a sophisticated form of encoding that is supported out of the box in PyTorch via the torchtext utility. Finally, you will explore how to build complex multi-level RNNs and bidirectional RNNs to capture both backward and forward relationships within data. You will round out the course by building sequence-to-sequence RNNs for language translation. When you are finished with this course, you will have the skills and knowledge to design and implement complex natural language processing models using sophisticated recurrent neural networks in PyTorch.
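
The sequence-to-sequence models mentioned at the end of the transcript typically follow an encoder-decoder pattern. The sketch below is a hypothetical, minimal PyTorch encoder-decoder, not code from the course; the GRU layers, vocabulary sizes, and dimensions are invented for illustration.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Encodes a source sentence into a single final hidden state."""
    def __init__(self, src_vocab=8000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(src_vocab, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, src_ids):
        _, hidden = self.gru(self.embedding(src_ids))
        return hidden  # (1, batch, hidden_dim): summary of the source sentence

class Decoder(nn.Module):
    """Generates target-language logits step by step, conditioned on the encoder state."""
    def __init__(self, tgt_vocab=9000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(tgt_vocab, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, tgt_vocab)

    def forward(self, tgt_ids, hidden):
        output, hidden = self.gru(self.embedding(tgt_ids), hidden)
        return self.out(output), hidden  # per-step vocabulary logits

# Toy usage: "translate" one 5-token source with a 6-token teacher-forced target
encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, 8000, (1, 5))
tgt_in = torch.randint(0, 9000, (1, 6))
logits, _ = decoder(tgt_in, encoder(src))
print(logits.shape)  # torch.Size([1, 6, 9000])
```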