Language Modeling with Recurrent Neural Networks in TensorFlow
If you are working with text data in neural networks, RNNs are a natural choice for sequences. This course works through two language modeling problems using RNNs: optical character recognition (OCR) and text generation using character prediction.
What you'll learn
Recurrent Neural Networks (RNN) performance and predictive abilities can be improved by using long memory cells such as the LSTM and the GRU cell.
In this course, Language Modeling with Recurrent Neural Networks in TensorFlow, you will learn how RNNs are a natural fit for language modeling because of their inherent ability to store state.
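To make the idea of stored state concrete, here is a minimal plain-Python sketch (scalar weights chosen for illustration, not from the course) of a single recurrent neuron: each step's hidden value depends on the current input *and* the previous hidden value, so earlier inputs keep influencing later outputs.

```python
import math

def rnn_step(h_prev, x, w_h=0.5, w_x=1.0):
    """One recurrent update: h_t = tanh(w_h * h_{t-1} + w_x * x_t)."""
    return math.tanh(w_h * h_prev + w_x * x)

def run_rnn(inputs, h0=0.0):
    """Unroll the recurrence over a sequence, returning every hidden state."""
    states = []
    h = h0
    for x in inputs:
        h = rnn_step(h, x)
        states.append(h)
    return states

states = run_rnn([1.0, 0.0, 0.0])
# Even though the second and third inputs are zero, their hidden states
# are nonzero: information from the first input persists in the state.
```

In a plain RNN this carried state fades quickly over long sequences, which is exactly what long memory cells such as the LSTM and GRU are designed to mitigate.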
First, you will learn how to model OCR as a sequence labeling problem.
Next, you will explore how you can architect an RNN to predict the next character based on past sequences.
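The training data for next-character prediction can be cut from raw text with a sliding window, as the later module "Generate Training Data and Labels Using a Sliding Window" covers. A minimal sketch (the helper name is my own, not from the course):

```python
def sliding_window_pairs(text, window=4, step=1):
    """Yield (input_sequence, next_char) training pairs for
    character-level prediction: the model sees `window` characters
    and learns to predict the character that follows them."""
    pairs = []
    for i in range(0, len(text) - window, step):
        pairs.append((text[i:i + window], text[i + window]))
    return pairs

pairs = sliding_window_pairs("hello world", window=4)
# First pair: input "hell", label "o"
```

At generation time the trained model is run repeatedly: it predicts a character, the window slides forward to include it, and the next character is predicted from the extended sequence.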
Finally, you will focus on understanding advanced functionality that the TensorFlow library offers, such as bidirectional RNNs and the multi-RNN cell.
By the end of this course, you will know how to apply and architect RNNs for use cases such as image recognition, character prediction, and text generation, and you will be comfortable using TensorFlow's advanced functionality, such as the bidirectional RNN and the multi-RNN cell.
Table of contents
- Version Check 0m
- Module Overview 2m
- Prerequisites and Course Outline 2m
- The Recurrent Neuron 4m
- Training a Recurrent Neural Network 5m
- The Long Memory Cell 5m
- Bidirectional RNNs 7m
- OCR: A Sequence Labeling Problem 4m
- OCR File Format 4m
- Features and Labels for OCR 2m
- Conventional RNN Architecture 5m
- Bidirectional RNN Architecture 3m
- Module Overview 1m
- Running Jupyter Notebook and Import Statements 3m
- Download and Parse OCR File 2m
- Features and Labels 8m
- Shuffle and Feed in Training Data 3m
- Sequence Length Calculations 2m
- Building the RNN 7m
- Training and Evaluating the RNN 6m
- Manually Set Up the Bidirectional RNN 6m
- Bidirectional RNN Using the TF Library 3m
- Module Overview 2m
- Using Neural Networks for Natural Language Processing 5m
- Language Modeling Problems 5m
- The Multi-RNN Cell 6m
- Generate Training Data and Labels Using a Sliding Window 5m
- Text Generation Using Character Prediction 2m
- RNN Architecture for Text Prediction 6m
- Understanding Perplexity 6m