Foundations of PyTorch

This course covers many aspects of building deep learning models in PyTorch, including neurons and neural networks, and how PyTorch uses differential calculus to train such models and create dynamic computation graphs in deep learning.
Course info
Level
Beginner
Updated
Apr 1, 2019
Duration
2h 51m
Table of contents
Course Overview
Getting Started with PyTorch for Machine Learning
Working with Gradients Using the Autograd Library
Building Dynamic Computation Graphs
Working with PyTorch Tensors
Description

PyTorch is fast emerging as a popular choice for building deep learning models owing to its flexibility, ease of use, and built-in support for optimized hardware such as GPUs. Using PyTorch, you can build complex deep learning models while still using Python-native support for debugging and visualization. In this course, Foundations of PyTorch, you will gain the ability to leverage PyTorch support for dynamic computation graphs, and contrast that with other popular frameworks such as TensorFlow. First, you will learn the internals of neurons and neural networks, and see how activation functions, affine transformations, and layers come together inside a deep learning model. Next, you will discover how such a model is trained, that is, how the best values of model parameters are estimated. You will then see how gradient descent optimization is smartly implemented to optimize this process. You will understand the different types of differentiation that could be used in this process, and how PyTorch uses Autograd to implement reverse-mode auto-differentiation. You will work with different PyTorch constructs such as Tensors, Variables, and Gradients. Finally, you will explore how to build dynamic computation graphs in PyTorch. You will round out the course by contrasting this with the approaches used in TensorFlow, another leading deep learning framework, which previously offered only static computation graphs but has recently added support for dynamic computation graphs. When you're finished with this course, you will have the skills and knowledge to move on to building deep learning models in PyTorch and harness the power of dynamic computation graphs.
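The reverse-mode auto-differentiation mentioned above can be sketched in a few lines. This is a minimal, standalone illustration (not material from the course): a tensor created with `requires_grad=True` is tracked in the computation graph, and calling `backward()` runs the reverse-mode pass that populates its gradient.

```python
import torch

# A tensor tracked by Autograd; operations on it build a computation graph.
x = torch.tensor(3.0, requires_grad=True)

y = x ** 2 + 2 * x   # y = x^2 + 2x
y.backward()         # reverse-mode pass: computes dy/dx and stores it in x.grad

print(x.grad)        # dy/dx = 2x + 2 = 8 at x = 3
```

Note that in modern PyTorch (0.4 and later), the `Variable` wrapper has been merged into `Tensor`, so tensors themselves carry `requires_grad` and `.grad`.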

About the author

A problem solver at heart, Janani has a master's degree from Stanford and worked for 7+ years at Google. She was one of the original engineers on Google Docs and holds 4 patents for its real-time collaborative editing framework.

More from the author
Building Features from Image Data
Advanced
2h 10m
Aug 13, 2019
Designing a Machine Learning Model
Intermediate
3h 25m
Aug 13, 2019
More courses by Janani Ravi
Section Introduction Transcripts

Course Overview
Hi, my name is Janani Ravi, and welcome to this course on the foundations of PyTorch. A little about myself: I have a master's degree in electrical engineering from Stanford, and have worked at companies such as Microsoft, Google, and Flipkart. At Google, I was one of the first engineers working on real-time collaborative editing in Google Docs, and I hold four patents for its underlying technologies. I currently work on my own startup, Loonycorn, a studio for high-quality video content. In this course, you'll gain the ability to leverage PyTorch support for dynamic computation graphs and contrast that with other popular frameworks, such as TensorFlow. First, you'll learn the internals of neurons and neural networks, and see how activation functions, affine transformations, and layers come together inside a deep learning model. Next, you'll discover how such a model is trained, that is, how the best values of model parameters are estimated. You will then see how gradient descent optimization is smartly implemented to optimize this process. You will understand the different types of differentiation that could be used in this process, and how PyTorch uses Autograd to implement reverse-mode auto-differentiation. You will work with different PyTorch constructs, such as Tensors, Variables, and Gradients. Finally, you'll explore how to build dynamic computation graphs in PyTorch. You will round out the course by contrasting this with approaches used in TensorFlow, another popular deep learning framework, which used to offer only static computation graphs, but has recently also added support for dynamic computation graphs. When you're finished with this course, you'll have the skills and knowledge to move on to building deep learning models in PyTorch and harnessing the power of dynamic computation graphs.
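The dynamic computation graphs contrasted with TensorFlow's static ones can be illustrated with a small sketch (not taken from the course): ordinary Python control flow, such as a data-dependent `if`, decides the graph's structure on every forward pass, because the graph is rebuilt as operations execute.

```python
import torch

x = torch.tensor(2.0, requires_grad=True)

# The branch taken depends on the data, so the graph recorded by Autograd
# differs from run to run; no graph is declared ahead of time.
if x.item() > 0:
    y = x * 3       # this branch is recorded for x = 2
else:
    y = x ** 2

y.backward()
print(x.grad)       # gradient along the branch actually taken: 3
```

With a static graph, such data-dependent branching would instead require special graph-level control-flow operations declared before execution.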