PyTorch for Deep Learning in 2024

Learn PyTorch from scratch! This PyTorch course is your step-by-step guide to developing your own deep learning models using PyTorch. You'll learn Deep Learning with PyTorch by building a massive 3-part real-world milestone project. By the end, you'll have the skills and portfolio to get hired as a Deep Learning Engineer.

Learn PyTorch. Become a Deep Learning Engineer. Get Hired.

50 Days

Average time students take to complete this course.

Taught by: Daniel Bourke
Last updated: March 2024

Course overview

We can guarantee (with, like, 99.57% confidence) that this is the most comprehensive, modern, and up-to-date course you will find to learn PyTorch and the cutting-edge field of Deep Learning. Daniel takes you step-by-step from an absolute beginner to becoming a master of Deep Learning with PyTorch.

What you'll learn

  • Everything from getting started with PyTorch to building your own real-world models
  • Why PyTorch is a fantastic way to start working in machine learning
  • How to integrate Deep Learning into tools and applications
  • How to create and use machine learning algorithms just like you would write a Python program
  • How to build and deploy your own custom-trained PyTorch neural network that's accessible to the public
  • How to take data, build an ML algorithm to find patterns in it, and then use that algorithm as an AI to enhance your applications
  • How to master deep learning and become a top candidate for recruiters seeking Deep Learning Engineers
  • How to expand your Machine Learning and Deep Learning skills and toolkit
  • The skills you need to become a Deep Learning Engineer and get hired with a chance of making US$100,000+ / year

What is PyTorch and why should I learn it?

PyTorch is a machine learning and deep learning framework written in Python.

PyTorch enables you to build new state-of-the-art deep learning algorithms, and use existing ones, like the neural networks powering much of today’s Artificial Intelligence (AI) applications.

Plus, it's so hot right now that there are lots of jobs available!

PyTorch is used by companies like:

  • Tesla to build the computer vision systems for their self-driving cars
  • Meta to power the curation and understanding systems for their content timelines
  • Apple to create computationally enhanced photography

Want to know what's even cooler?

Much of the latest machine learning research is done and published using PyTorch code, so knowing how it works means you’ll be at the cutting edge of this highly in-demand field.

And you'll be learning PyTorch in good company.

Graduates of Zero To Mastery are now working at Google, Tesla, Amazon, Apple, IBM, Uber, Meta, Shopify + other top tech companies at the forefront of machine learning and deep learning.

This can be you.

By enrolling today, you’ll also get to join our exclusive live online community classroom to learn alongside thousands of students, alumni, mentors, TAs and Instructors.

Most importantly, you will be learning PyTorch from a professional machine learning engineer with real-world experience who is also one of the best teachers around!

What will this PyTorch course be like?

This PyTorch course is very hands-on and project based. You won't just be staring at your screen. We'll leave that for other PyTorch tutorials and courses.

In this course you'll actually be:

  • Running experiments
  • Completing exercises to test your skills
  • Building real-world deep learning models and projects to mimic real life scenarios

By the end of it all, you'll have the skillset needed to identify and develop modern deep learning solutions to the kinds of problems Big Tech companies encounter.

⚠ Fair warning: this course is very comprehensive. But don't be intimidated, Daniel will teach you everything from scratch and step-by-step!

Here's what you'll learn in this PyTorch course:

1. PyTorch Fundamentals — We start with the bare-bones fundamentals, so even if you're a beginner you'll get up to speed.

In machine learning, data gets represented as a tensor (a collection of numbers). Learning how to craft tensors with PyTorch is paramount to building machine learning algorithms. In PyTorch Fundamentals we cover the PyTorch tensor datatype in-depth.
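To give you a taste of the kind of code this section covers, here's a minimal sketch of working with tensors (illustrative, not the course's exact notebooks):

```python
import torch

# A tensor is a (possibly multi-dimensional) collection of numbers
scalar = torch.tensor(7)                     # 0-dimensional tensor
matrix = torch.tensor([[1., 2.], [3., 4.]])  # 2-dimensional tensor

# Every tensor carries attributes you'll use constantly
print(matrix.shape)   # torch.Size([2, 2])
print(matrix.dtype)   # torch.float32
print(matrix.device)  # cpu (or cuda once the tensor is moved to a GPU)

# Random tensors are the starting point of most neural networks
random_tensor = torch.rand(3, 4)
print(random_tensor @ random_tensor.T)  # matrix multiplication, giving a 3x3 result
```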

2. PyTorch Workflow — Okay, you’ve got the fundamentals down, and you've made some tensors to represent data, but what now?

With PyTorch Workflow you’ll learn the steps to go from data -> tensors -> trained neural network model. You’ll see and use these steps throughout the rest of the course, and wherever you encounter PyTorch code.
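As a rough preview, the whole workflow on a tiny straight-line dataset can look something like the sketch below (simplified for illustration, not the course's exact code):

```python
import torch
from torch import nn

# 1. Data: a simple straight line (y = weight * x + bias) represented as tensors
weight, bias = 0.7, 0.3
X = torch.arange(0, 1, 0.02).unsqueeze(dim=1)
y = weight * X + bias

# 2. Model: a single linear layer is enough to learn a straight line
model = nn.Linear(in_features=1, out_features=1)

# 3. Loss function and optimizer
loss_fn = nn.L1Loss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# 4. Training loop: forward pass -> loss -> backpropagation -> parameter update
for epoch in range(200):
    model.train()
    y_pred = model(X)           # forward pass
    loss = loss_fn(y_pred, y)   # calculate the loss
    optimizer.zero_grad()       # reset gradients from the previous step
    loss.backward()             # backpropagation
    optimizer.step()            # update the parameters

print(model.weight, model.bias)  # values should move toward 0.7 and 0.3
```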

3. PyTorch Neural Network Classification — Classification is one of the most common machine learning problems.

  • Is something one thing or another?
  • Is an email spam or not spam?
  • Is a credit card transaction fraud or not fraud?

With PyTorch Neural Network Classification you’ll learn how to code a neural network classification model using PyTorch so that you can classify things and answer these questions.
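Here's a minimal sketch of what a binary classifier can look like in PyTorch (the model name and data here are made up purely for illustration):

```python
import torch
from torch import nn

class SpamClassifier(nn.Module):  # hypothetical name, just for illustration
    """A tiny binary classifier: 2 input features -> 1 output logit."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_features=2, out_features=8),
            nn.ReLU(),  # non-linearity
            nn.Linear(in_features=8, out_features=1),
        )

    def forward(self, x):
        return self.layers(x)  # raw logits

model = SpamClassifier()
logits = model(torch.rand(4, 2))      # 4 samples, 2 features each
probs = torch.sigmoid(logits)         # logits -> prediction probabilities
labels = (probs > 0.5).int()          # probabilities -> prediction labels (0 or 1)
print(labels.squeeze())               # e.g. tensor([0, 1, 0, 1], dtype=torch.int32)

# Binary cross entropy with logits is the usual loss function for this setup
loss_fn = nn.BCEWithLogitsLoss()
```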

4. PyTorch Computer Vision — Neural networks have changed the game of computer vision forever. And now PyTorch drives many of the latest advancements in computer vision algorithms.

For example, Tesla uses PyTorch to build the computer vision algorithms for its self-driving software.

With PyTorch Computer Vision you’ll build a PyTorch neural network capable of seeing patterns in images and classifying them into different categories.
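For a flavour of what's ahead, here's a minimal convolutional neural network sketch (illustrative only, not the course's exact architecture):

```python
import torch
from torch import nn

class TinyCNN(nn.Module):
    """A minimal CNN for 28x28 grayscale images (e.g. clothing photos)."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels=1, out_channels=10, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),  # 28x28 feature maps -> 14x14
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(10 * 14 * 14, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.conv(x))

model = TinyCNN()
dummy_batch = torch.rand(32, 1, 28, 28)  # [batch, colour channels, height, width]
print(model(dummy_batch).shape)          # torch.Size([32, 10]), one logit per class
```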

5. PyTorch Custom Datasets — The magic of machine learning is building algorithms to find patterns in your own custom data. There are plenty of existing datasets out there, but how do you load your own custom dataset into PyTorch?

This is exactly what you'll learn with the PyTorch Custom Datasets section of this course.

You’ll learn how to load an image dataset for FoodVision Mini: a PyTorch computer vision model capable of classifying images of pizza, steak and sushi (am I making you hungry to learn yet?!).

We’ll be building upon FoodVision Mini for the rest of the course.
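Loading a custom image dataset can take surprisingly little code. Here's a sketch using torchvision's ImageFolder (the directory path below is hypothetical and assumes images sorted into one folder per class):

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Assumes a layout like: data/pizza_steak_sushi/train/pizza/*.jpg, .../steak/*.jpg, ...
train_dir = "data/pizza_steak_sushi/train"  # hypothetical path

# Transforms turn images into tensors (and can resize or augment them along the way)
train_transform = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])

# ImageFolder infers the class names from the directory names
train_data = datasets.ImageFolder(root=train_dir, transform=train_transform)
print(train_data.classes)  # e.g. ['pizza', 'steak', 'sushi']

# A DataLoader batches the dataset so a model can train on it
train_dataloader = DataLoader(train_data, batch_size=32, shuffle=True)
images, labels = next(iter(train_dataloader))
print(images.shape)        # torch.Size([32, 3, 64, 64])
```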

6. PyTorch Going Modular — The whole point of PyTorch is to be able to write Pythonic machine learning code.

There are two main tools for writing machine learning code with Python:

  1. A Jupyter/Google Colab notebook (great for experimenting)
  2. Python scripts (great for reproducibility and modularity)

In the PyTorch Going Modular section of this course, you’ll learn how to take your most useful Jupyter/Google Colab Notebook code and turn it into reusable Python scripts. This is often how you’ll find PyTorch code shared in the wild.
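For example, a notebook cell that builds DataLoaders might become a small script like the sketch below (the file and function names are illustrative, not prescribed):

```python
# data_setup.py (illustrative): notebook code moved into a reusable script
"""Creates PyTorch DataLoaders from image folders of training and test data."""
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


def create_dataloaders(train_dir: str, test_dir: str,
                       transform: transforms.Compose, batch_size: int):
    """Turns image folders into training and test DataLoaders plus class names."""
    train_data = datasets.ImageFolder(train_dir, transform=transform)
    test_data = datasets.ImageFolder(test_dir, transform=transform)

    train_dataloader = DataLoader(train_data, batch_size=batch_size, shuffle=True)
    test_dataloader = DataLoader(test_data, batch_size=batch_size, shuffle=False)
    return train_dataloader, test_dataloader, train_data.classes
```

Once your code lives in scripts like this, kicking off a full training run can be as simple as a single command such as python train.py.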

7. PyTorch Transfer Learning — What if you could take what one model has learned and leverage it for your own problems? That’s what PyTorch Transfer Learning covers.

You’ll learn about the power of transfer learning and how it enables you to take a machine learning model trained on millions of images, modify it slightly, and enhance the performance of FoodVision Mini, saving you time and resources.
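In practice, transfer learning with torchvision can look something like this sketch (assumes torchvision 0.13+ for the weights API; the 3-class head matches FoodVision Mini's pizza, steak and sushi classes):

```python
import torch
from torch import nn
import torchvision

# Load a model pretrained on ImageNet
weights = torchvision.models.EfficientNet_B0_Weights.DEFAULT
model = torchvision.models.efficientnet_b0(weights=weights)

# Freeze the base layers so their pretrained patterns stay intact
for param in model.features.parameters():
    param.requires_grad = False

# Swap the classifier head for one suited to 3 food classes (pizza, steak, sushi)
model.classifier = nn.Sequential(
    nn.Dropout(p=0.2),
    nn.Linear(in_features=1280, out_features=3),
)

# Only the new head's parameters will be updated during training
trainable = [name for name, param in model.named_parameters() if param.requires_grad]
print(trainable)  # e.g. ['classifier.1.weight', 'classifier.1.bias']
```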

8. PyTorch Experiment Tracking — Now we're going to start cooking with heat by kicking off Part 1 of the course's Milestone Project!

At this point you’ll have built plenty of PyTorch models. But how do you keep track of which model performs the best?

That’s where PyTorch Experiment Tracking comes in.

Following the machine learning practitioner’s motto of "experiment, experiment, experiment!", you’ll set up a system to keep track of various FoodVision Mini experiment results and then compare them to find the best.
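One common way to do this, and the tool the course uses for it, is TensorBoard via PyTorch's SummaryWriter. Here's a minimal sketch (the experiment name and metric values are placeholders):

```python
from torch.utils.tensorboard import SummaryWriter

# One SummaryWriter per experiment keeps results in separate log directories
writer = SummaryWriter(log_dir="runs/effnetb0_10_epochs")  # hypothetical experiment name

for epoch in range(10):
    # ... training and testing steps would go here ...
    train_loss = 0.5 / (epoch + 1)  # placeholder value
    test_acc = 0.70 + 0.02 * epoch  # placeholder value

    # Log scalar metrics so experiments can be compared side by side in TensorBoard
    writer.add_scalar("Loss/train", train_loss, global_step=epoch)
    writer.add_scalar("Accuracy/test", test_acc, global_step=epoch)

writer.close()
# View and compare the logged runs with: tensorboard --logdir runs
```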

9. PyTorch Paper Replicating — The field of machine learning advances quickly. New research papers get published every day. Being able to read and understand these papers takes time and practice.

So that’s what PyTorch Paper Replicating covers. You’ll learn how to go through a machine learning research paper and replicate it with PyTorch code.

At this point you'll also undertake Part 2 of our Milestone Project, where you’ll replicate the groundbreaking Vision Transformer architecture!
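To give you an idea of the kind of code you'll write, here's a sketch of the ViT paper's patch embedding trick, using the ViT-Base hyperparameters from the paper (simplified for illustration):

```python
import torch
from torch import nn

# The Vision Transformer turns an image into a sequence of patch embeddings.
# A Conv2d with kernel_size == stride == patch_size splits the image into
# non-overlapping patches and projects each one to an embedding vector in one step.
patch_size, embedding_dim = 16, 768
patch_embedding = nn.Conv2d(in_channels=3, out_channels=embedding_dim,
                            kernel_size=patch_size, stride=patch_size)

image = torch.rand(1, 3, 224, 224)                        # [batch, channels, height, width]
patches = patch_embedding(image)                          # [1, 768, 14, 14]
sequence = patches.flatten(start_dim=2).permute(0, 2, 1)  # [1, 196, 768] = 196 patch embeddings
print(sequence.shape)
```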

10. PyTorch Model Deployment — By this stage your FoodVision model will be performing quite well. But up until now, you’ve been the only one with access to it.

How do you get your PyTorch models in the hands of others?

That’s what PyTorch Model Deployment covers. In Part 3 of your Milestone Project, you’ll learn how to take the best performing FoodVision Mini model and deploy it to the web so other people can access it and try it out with their own food images.
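The deployment sections use Gradio and Hugging Face Spaces. Here's a heavily simplified sketch of what a Gradio demo looks like (the model and its predictions are stand-ins here):

```python
import gradio as gr
import torch

# In the real project the trained FoodVision model and its class names come from
# earlier sections; these are stand-ins to show the shape of a Gradio demo.
class_names = ["pizza", "steak", "sushi"]

def predict(img):
    """Takes an image and returns a dict of class names to prediction probabilities."""
    # transform the image, pass it through the model, softmax the logits...
    probs = torch.softmax(torch.rand(len(class_names)), dim=0)  # placeholder predictions
    return {name: float(prob) for name, prob in zip(class_names, probs)}

demo = gr.Interface(
    fn=predict,
    inputs=gr.Image(type="pil"),
    outputs=gr.Label(num_top_classes=3),
    title="FoodVision Mini",
)

demo.launch()  # once pushed to Hugging Face Spaces, anyone can try it in a browser
```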

What's the bottom line?

Machine learning's growth and adoption are exploding, and deep learning is how you take your machine learning knowledge to the next level. More and more job openings are looking for this specialized knowledge.

Companies like Tesla, Microsoft, OpenAI, Meta (Facebook + Instagram), Airbnb and many others are currently powered by PyTorch.

And this is the most comprehensive online bootcamp to learn PyTorch and kickstart your career as a Deep Learning Engineer.

So why wait? Advance your career and earn a higher salary by mastering PyTorch and adding deep learning to your toolkit 💪.

And you have nothing to lose. Because you can start learning right now and if this course isn't everything you expected, we'll refund you 100% within 30 days. No hassles and no questions asked.

When's the best time to get started? Today!

There's never a bad time to learn in-demand skills. But the sooner, the better. So start learning PyTorch today by joining the ZTM Academy. You'll have a clear roadmap to developing the skills to build your own projects, get hired, and advance your career.

Join Zero To Mastery Now

What you'll build

The best way you learn is by doing. Not just watching endless tutorials. That's why a key part of this course is the real-world projects that you'll get to build. Plus they'll look great on your portfolio.

[Part 1] Milestone Project - FoodVision Experiment Tracking

Throughout the course you'll build a deep learning computer vision model called FoodVision to identify over 100 different foods. For Part 1, you’ll code and track different modeling experiments and determine the best performer.

[Part 2] Milestone Project - FoodVision Paper Replicating

For Part 2, you’ll take a machine learning research paper from the Google Brain research team that's related to FoodVision and replicate it with PyTorch code, so that you'll be able to replicate the research you come across in your career.

[Part 3] Milestone Project - FoodVision Model Deployment

In Part 3, you'll take your FoodVision model and actually deploy it to the internet so that anyone from around the world can use your model with their own images of food, and so that you can show off your model to potential employers!

Join Zero To Mastery Now

Course curriculum

To make sure this course is a good fit for you, you can start learning PyTorch for free right now by clicking any of the PREVIEW links below.

Introduction

7 lectures

PyTorch for Deep Learning3:33

PREVIEW

Course Welcome and What Is Deep Learning5:53

PREVIEW

Exercise: Meet Your Classmates and Instructor

PREVIEW

Course Companion Book + Code + More

PREVIEW

Machine Learning + Python Monthly

PREVIEW

Understanding Your Video Player (notes, video speed, subtitles + more)

PREVIEW

Set Your Learning Streak Goal

PREVIEW

Section 00: PyTorch Fundamentals

33 lectures

Why Use Machine Learning or Deep Learning3:33

PREVIEW

The Number 1 Rule of Machine Learning and What Is Deep Learning Good For5:39

PREVIEW

Machine Learning vs. Deep Learning6:06

PREVIEW

Anatomy of Neural Networks9:21

PREVIEW

Different Types of Learning Paradigms4:30

PREVIEW

What Can Deep Learning Be Used For6:21

PREVIEW

What Is and Why PyTorch10:12

PREVIEW

What Are Tensors4:15

PREVIEW

What We Are Going To Cover With PyTorch6:05

BEGIN

How To and How Not To Approach This Course5:09

BEGIN

Important Resources For This Course5:21

BEGIN

Getting Setup to Write PyTorch Code7:39

BEGIN

Introduction to PyTorch Tensors13:24

BEGIN

Creating Random Tensors in PyTorch9:58

BEGIN

Creating Tensors With Zeros and Ones in PyTorch3:08

BEGIN

Creating a Tensor Range and Tensors Like Other Tensors5:17

BEGIN

Dealing With Tensor Data Types9:24

BEGIN

Getting Tensor Attributes8:22

BEGIN

Manipulating Tensors (Tensor Operations)5:59

BEGIN

Matrix Multiplication (Part 1)9:34

BEGIN

Matrix Multiplication (Part 2): The Two Main Rules of Matrix Multiplication7:51

BEGIN

Matrix Multiplication (Part 3): Dealing With Tensor Shape Errors12:56

BEGIN

Finding the Min Max Mean and Sum of Tensors (Tensor Aggregation)6:09

BEGIN

Finding The Positional Min and Max of Tensors3:16

BEGIN

Reshaping, Viewing and Stacking Tensors13:40

BEGIN

Squeezing, Unsqueezing and Permuting Tensors11:55

BEGIN

Selecting Data From Tensors (Indexing)9:31

BEGIN

PyTorch Tensors and NumPy9:08

BEGIN

PyTorch Reproducibility (Taking the Random Out of Random)10:46

BEGIN

Different Ways of Accessing a GPU in PyTorch11:50

BEGIN

Setting up Device Agnostic Code and Putting Tensors On and Off the GPU7:43

BEGIN

PyTorch Fundamentals: Exercises and Extra-Curriculum4:49

BEGIN

Let's Have Some Fun (+ Free Resources)

BEGIN

Section 01: PyTorch Workflow

28 lectures

Introduction and Where You Can Get Help2:45

BEGIN

Getting Setup and What We Are Covering7:14

BEGIN

Creating a Simple Dataset Using the Linear Regression Formula9:40

BEGIN

Splitting Our Data Into Training and Test Sets8:19

BEGIN

Building a function to Visualize Our Data7:45

BEGIN

Creating Our First PyTorch Model for Linear Regression14:09

BEGIN

Breaking Down What's Happening in Our PyTorch Linear Regression Model6:10

BEGIN

Discussing Some of the Most Important PyTorch Model Building Classes6:26

BEGIN

Checking Out the Internals of Our PyTorch Model9:50

BEGIN

Making Predictions With Our Random Model Using Inference Mode11:12

BEGIN

Training a Model Intuition (The Things We Need)8:14

BEGIN

Setting Up an Optimizer and a Loss Function12:51

BEGIN

PyTorch Training Loop Steps and Intuition13:53

BEGIN

Writing Code for a PyTorch Training Loop8:46

BEGIN

Reviewing the Steps in a Training Loop Step by Step14:57

BEGIN

Running Our Training Loop Epoch by Epoch and Seeing What Happens9:25

BEGIN

Writing Testing Loop Code and Discussing What's Happening Step by Step11:37

BEGIN

Reviewing What Happens in a Testing Loop Step by Step14:42

BEGIN

Writing Code to Save a PyTorch Model13:45

BEGIN

Writing Code to Load a PyTorch Model8:44

BEGIN

Setting Up to Practice Everything We Have Done Using Device-Agnostic Code6:02

BEGIN

Putting Everything Together (Part 1): Data6:07

BEGIN

Putting Everything Together (Part 2): Building a Model10:07

BEGIN

Putting Everything Together (Part 3): Training a Model12:39

BEGIN

Putting Everything Together (Part 4): Making Predictions With a Trained Model5:17

BEGIN

Putting Everything Together (Part 5): Saving and Loading a Trained Model9:10

BEGIN

PyTorch Workflow: Exercises and Extra-Curriculum3:57

BEGIN

Unlimited Updates

BEGIN

Section 02: PyTorch Neural Network Classification

33 lectures

Introduction to Machine Learning Classification With PyTorch9:41

BEGIN

Classification Problem Example: Input and Output Shapes9:06

BEGIN

Typical Architecture of a Classification Neural Network (Overview)6:30

BEGIN

Making a Toy Classification Dataset12:18

BEGIN

Turning Our Data into Tensors and Making a Training and Test Split11:55

BEGIN

Laying Out Steps for Modelling and Setting Up Device-Agnostic Code4:19

BEGIN

Coding a Small Neural Network to Handle Our Classification Data10:57

BEGIN

Making Our Neural Network Visual6:57

BEGIN

Recreating and Exploring the Insides of Our Model Using nn.Sequential13:17

BEGIN

Setting Up a Loss Function Optimizer and Evaluation Function for Our Classification Network14:50

BEGIN

Going from Model Logits to Prediction Probabilities to Prediction Labels16:06

BEGIN

Coding a Training and Testing Optimization Loop for Our Classification Model15:26

BEGIN

Writing Code to Download a Helper Function to Visualize Our Models Predictions14:13

BEGIN

Discussing Options to Improve a Model8:02

BEGIN

Creating a New Model with More Layers and Hidden Units9:06

BEGIN

Writing Training and Testing Code to See if Our New and Upgraded Model Performs Better12:45

BEGIN

Creating a Straight Line Dataset to See if Our Model is Learning Anything8:07

BEGIN

Building and Training a Model to Fit on Straight Line Data10:01

BEGIN

Evaluating Our Models Predictions on Straight Line Data5:23

BEGIN

Introducing the Missing Piece for Our Classification Model Non-Linearity10:00

BEGIN

Building Our First Neural Network with Non-Linearity10:25

BEGIN

Writing Training and Testing Code for Our First Non-Linear Model15:12

BEGIN

Making Predictions with and Evaluating Our First Non-Linear Model5:47

BEGIN

Replicating Non-Linear Activation Functions with Pure PyTorch9:34

BEGIN

Putting It All Together (Part 1): Building a Multiclass Dataset11:24

BEGIN

Creating a Multi-Class Classification Model with PyTorch12:27

BEGIN

Setting Up a Loss Function and Optimizer for Our Multi-Class Model6:39

BEGIN

Going from Logits to Prediction Probabilities to Prediction Labels with a Multi-Class Model11:01

BEGIN

Training a Multi-Class Classification Model and Troubleshooting Code on the Fly16:17

BEGIN

Making Predictions with and Evaluating Our Multi-Class Classification Model7:59

BEGIN

Discussing a Few More Classification Metrics9:17

BEGIN

PyTorch Classification: Exercises and Extra-Curriculum2:58

BEGIN

Course Check-In

BEGIN

Section 03: PyTorch Computer Vision

35 lectures

What Is a Computer Vision Problem and What We Are Going to Cover11:47

BEGIN

Computer Vision Input and Output Shapes10:08

BEGIN

What Is a Convolutional Neural Network (CNN)5:02

BEGIN

Discussing and Importing the Base Computer Vision Libraries in PyTorch9:19

BEGIN

Getting a Computer Vision Dataset and Checking Out Its Input and Output Shapes14:30

BEGIN

Visualizing Random Samples of Data9:51

BEGIN

DataLoader Overview Understanding Mini-Batch7:17

BEGIN

Turning Our Datasets Into DataLoaders12:23

BEGIN

Model 0: Creating a Baseline Model with Two Linear Layers14:38

BEGIN

Creating a Loss Function: an Optimizer for Model 010:29

BEGIN

Creating a Function to Time Our Modelling Code5:34

BEGIN

Writing Training and Testing Loops for Our Batched Data21:25

BEGIN

Writing an Evaluation Function to Get Our Models Results12:58

BEGIN

Setup Device-Agnostic Code for Running Experiments on the GPU3:46

BEGIN

Model 1: Creating a Model with Non-Linear Functions9:03

BEGIN

Model 1: Creating a Loss Function and Optimizer3:04

BEGIN

Turning Our Training Loop into a Function8:28

BEGIN

Turning Our Testing Loop into a Function6:35

BEGIN

Training and Testing Model 1 with Our Training and Testing Functions11:52

BEGIN

Getting a Results Dictionary for Model 14:08

BEGIN

Model 2: Convolutional Neural Networks High Level Overview8:24

BEGIN

Model 2: Coding Our First Convolutional Neural Network with PyTorch19:48

BEGIN

Model 2: Breaking Down Conv2D Step by Step14:59

BEGIN

Model 2: Breaking Down MaxPool2D Step by Step15:48

BEGIN

Model 2: Using a Trick to Find the Input and Output Shapes of Each of Our Layers13:45

BEGIN

Model 2: Setting Up a Loss Function and Optimizer2:38

BEGIN

Model 2: Training Our First CNN and Evaluating Its Results7:54

BEGIN

Comparing the Results of Our Modelling Experiments7:23

BEGIN

Making Predictions on Random Test Samples with the Best Trained Model11:39

BEGIN

Plotting Our Best Model Predictions on Random Test Samples and Evaluating Them8:10

BEGIN

Making Predictions Across the Whole Test Dataset and Importing Libraries to Plot a Confusion Matrix15:20

BEGIN

Evaluating Our Best Models Predictions with a Confusion Matrix6:54

BEGIN

Saving and Loading Our Best Performing Model11:27

BEGIN

Recapping What We Have Covered Plus Exercises and Extra-Curriculum6:01

BEGIN

Implement a New Life System

BEGIN

Section 04: PyTorch Custom Datasets

38 lectures

What Is a Custom Dataset and What We Are Going to Cover9:53

BEGIN

Importing PyTorch and Setting Up Device-Agnostic Code5:54

BEGIN

Downloading a Custom Dataset of Pizza, Steak and Sushi Images14:04

BEGIN

Becoming One With the Data (Part 1): Exploring the Data Format8:41

BEGIN

Becoming One With the Data (Part 2): Visualizing a Random Image11:40

BEGIN

Becoming One With the Data (Part 3): Visualizing a Random Image with Matplotlib4:47

BEGIN

Transforming Data (Part 1): Turning Images Into Tensors8:53

BEGIN

Transforming Data (Part 2): Visualizing Transformed Images11:30

BEGIN

Loading All of Our Images and Turning Them Into Tensors With ImageFolder9:17

BEGIN

Visualizing a Loaded Image From the Train Dataset7:18

BEGIN

Turning Our Image Datasets into PyTorch DataLoaders9:03

BEGIN

Creating a Custom Dataset Class in PyTorch High Level Overview7:59

BEGIN

Creating a Helper Function to Get Class Names From a Directory9:06

BEGIN

Writing a PyTorch Custom Dataset Class from Scratch to Load Our Images17:46

BEGIN

Compare Our Custom Dataset Class to the Original ImageFolder Class7:13

BEGIN

Writing a Helper Function to Visualize Random Images from Our Custom Dataset14:18

BEGIN

Turning Our Custom Datasets Into DataLoaders6:58

BEGIN

Exploring State of the Art Data Augmentation With Torchvision Transforms14:23

BEGIN

Building a Baseline Model (Part 1): Loading and Transforming Data8:15

BEGIN

Building a Baseline Model (Part 2): Replicating Tiny VGG from Scratch11:24

BEGIN

Building a Baseline Model (Part 3): Doing a Forward Pass to Test Our Model Shapes8:09

BEGIN

Using the Torchinfo Package to Get a Summary of Our Model6:38

BEGIN

Creating Training and Testing loop Functions13:03

BEGIN

Creating a Train Function to Train and Evaluate Our Models10:14

BEGIN

Training and Evaluating Model 0 With Our Training Functions9:53

BEGIN

Plotting the Loss Curves of Model 09:02

BEGIN

Discussing the Balance Between Overfitting and Underfitting and How to Deal With Each14:13

BEGIN

Creating Augmented Training Datasets and DataLoaders for Model 111:03

BEGIN

Constructing and Training Model 17:10

BEGIN

Plotting the Loss Curves of Model 13:22

BEGIN

Plotting the Loss Curves of All of Our Models Against Each Other10:55

BEGIN

Predicting on Custom Data (Part 1): Downloading an Image5:32

BEGIN

Predicting on Custom Data (Part2): Loading In a Custom Image With PyTorch7:00

BEGIN

Predicting on Custom Data (Part 3): Getting Our Custom Image Into the Right Format14:06

BEGIN

Predicting on Custom Data (Part 4): Turning Our Models Raw Outputs Into Prediction Labels4:24

BEGIN

Predicting on Custom Data (Part 5): Putting It All Together12:47

BEGIN

Summary of What We Have Covered Plus Exercises and Extra-Curriculum6:04

BEGIN

Exercise: Imposter Syndrome2:55

BEGIN

Section 05: PyTorch Going Modular

10 lectures

What Is Going Modular and What We Are Going to Cover11:34

BEGIN

Going Modular Notebook (Part 1): Running It End to End7:39

BEGIN

Downloading a Dataset4:49

BEGIN

Writing the Outline for Our First Python Script to Setup the Data13:50

BEGIN

Creating a Python Script to Create Our PyTorch DataLoaders10:35

BEGIN

Turning Our Model Building Code into a Python Script9:18

BEGIN

Turning Our Model Training Code into a Python Script6:16

BEGIN

Turning Our Utility Function to Save a Model into a Python Script6:06

BEGIN

Creating a Training Script to Train Our Model in One Line of Code15:46

BEGIN

Going Modular: Summary, Exercises and Extra-Curriculum5:59

BEGIN

Section 06: PyTorch Transfer Learning

19 lectures

Introduction: What is Transfer Learning and Why Use It10:05

BEGIN

Where Can You Find Pretrained Models and What We Are Going to Cover5:12

BEGIN

Installing the Latest Versions of Torch and Torchvision8:05

BEGIN

Downloading Our Previously Written Code from Going Modular6:41

BEGIN

Downloading Pizza, Steak, Sushi Image Data from Github8:00

BEGIN

Turning Our Data into DataLoaders with Manually Created Transforms14:40

BEGIN

Turning Our Data into DataLoaders with Automatic Created Transforms13:06

BEGIN

Which Pretrained Model Should You Use12:15

BEGIN

Setting Up a Pretrained Model with Torchvision10:57

BEGIN

Different Kinds of Transfer Learning7:11

BEGIN

Getting a Summary of the Different Layers of Our Model6:49

BEGIN

Freezing the Base Layers of Our Model and Updating the Classifier Head13:26

BEGIN

Training Our First Transfer Learning Feature Extractor Model7:54

BEGIN

Plotting the Loss Curves of Our Transfer Learning Model6:26

BEGIN

Outlining the Steps to Make Predictions on the Test Images7:57

BEGIN

Creating a Function Predict On and Plot Images10:00

BEGIN

Making and Plotting Predictions on Test Images7:23

BEGIN

Making a Prediction on a Custom Image6:21

BEGIN

Main Takeaways, Exercises and Extra Curriculum3:21

BEGIN

Section 07: PyTorch Experiment Tracking

22 lectures

What Is Experiment Tracking and Why Track Experiments7:06

BEGIN

Getting Setup by Importing Torch Libraries and Going Modular Code8:13

BEGIN

Creating a Function to Download Data10:23

BEGIN

Turning Our Data into DataLoaders Using Manual Transforms8:30

BEGIN

Turning Our Data into DataLoaders Using Automatic Transforms7:47

BEGIN

Preparing a Pretrained Model for Our Own Problem10:28

BEGIN

Setting Up a Way to Track a Single Model Experiment with TensorBoard13:35

BEGIN

Training a Single Model and Saving the Results to TensorBoard4:38

BEGIN

Exploring Our Single Models Results with TensorBoard10:17

BEGIN

Creating a Function to Create SummaryWriter Instances10:45

BEGIN

Adapting Our Train Function to Be Able to Track Multiple Experiments4:57

BEGIN

What Experiments Should You Try5:59

BEGIN

Discussing the Experiments We Are Going to Try6:01

BEGIN

Downloading Datasets for Our Modelling Experiments6:31

BEGIN

Turning Our Datasets into DataLoaders Ready for Experimentation8:28

BEGIN

Creating Functions to Prepare Our Feature Extractor Models15:54

BEGIN

Coding Out the Steps to Run a Series of Modelling Experiments14:27

BEGIN

Running Eight Different Modelling Experiments in 5 Minutes3:50

BEGIN

Viewing Our Modelling Experiments in TensorBoard13:38

BEGIN

Loading In the Best Model and Making Predictions on Random Images from the Test Set10:32

BEGIN

Making a Prediction on Our Own Custom Image with the Best Model3:44

BEGIN

Main Takeaways, Exercises and Extra Curriculum3:56

BEGIN

Section 08: PyTorch Paper Replicating

50 lectures

What Is a Machine Learning Research Paper?7:34

BEGIN

Why Replicate a Machine Learning Research Paper?3:13

BEGIN

Where Can You Find Machine Learning Research Papers and Code?8:18

BEGIN

What We Are Going to Cover8:21

BEGIN

Getting Setup for Coding in Google Colab8:21

BEGIN

Downloading Data for Food Vision Mini4:02

BEGIN

Turning Our Food Vision Mini Images into PyTorch DataLoaders9:47

BEGIN

Visualizing a Single Image3:45

BEGIN

Replicating a Vision Transformer - High Level Overview9:53

BEGIN

Breaking Down Figure 1 of the ViT Paper11:12

BEGIN

Breaking Down the Four Equations Overview and a Trick for Reading Papers10:55

BEGIN

Breaking Down Equation 18:14

BEGIN

Breaking Down Equations 2 and 310:03

BEGIN

Breaking Down Equation 47:27

BEGIN

Breaking Down Table 111:05

BEGIN

Calculating the Input and Output Shape of the Embedding Layer by Hand15:41

BEGIN

Turning a Single Image into Patches (Part 1: Patching the Top Row)15:03

BEGIN

Turning a Single Image into Patches (Part 2: Patching the Entire Image)12:33

BEGIN

Creating Patch Embeddings with a Convolutional Layer13:33

BEGIN

Exploring the Outputs of Our Convolutional Patch Embedding Layer12:54

BEGIN

Flattening Our Convolutional Feature Maps into a Sequence of Patch Embeddings9:59

BEGIN

Visualizing a Single Sequence Vector of Patch Embeddings5:03

BEGIN

Creating the Patch Embedding Layer with PyTorch17:01

BEGIN

Creating the Class Token Embedding13:24

BEGIN

Creating the Class Token Embedding - Less Birds13:24

BEGIN

Creating the Position Embedding11:25

BEGIN

Equation 1: Putting it All Together13:25

BEGIN

Equation 2: Multihead Attention Overview14:30

BEGIN

Equation 2: Layernorm Overview9:03

BEGIN

Turning Equation 2 into Code14:33

BEGIN

Checking the Inputs and Outputs of Equation5:40

BEGIN

Equation 3: Replication Overview9:11

BEGIN

Turning Equation 3 into Code11:25

BEGIN

Transformer Encoder Overview8:50

BEGIN

Combining Equation 2 and 3 to Create the Transformer Encoder9:16

BEGIN

Creating a Transformer Encoder Layer with In-Built PyTorch Layer15:54

BEGIN

Bringing Our Own Vision Transformer to Life - Part 1: Gathering the Pieces of the Puzzle18:19

BEGIN

Bringing Our Own Vision Transformer to Life - Part 2: Putting Together the Forward Method10:41

BEGIN

Getting a Visual Summary of Our Custom Vision Transformer7:13

BEGIN

Creating a Loss Function and Optimizer from the ViT Paper11:26

BEGIN

Training our Custom ViT on Food Vision Mini4:29

BEGIN

Discussing what Our Training Setup Is Missing9:08

BEGIN

Plotting a Loss Curve for Our ViT Model6:13

BEGIN

Getting a Pretrained Vision Transformer from Torchvision and Setting it Up14:37

BEGIN

Preparing Data to Be Used with a Pretrained ViT5:53

BEGIN

Training a Pretrained ViT Feature Extractor Model for Food Vision Mini7:15

BEGIN

Saving Our Pretrained ViT Model to File and Inspecting Its Size5:13

BEGIN

Discussing the Trade-Offs Between Using a Larger Model for Deployments3:46

BEGIN

Making Predictions on a Custom Image with Our Pretrained ViT3:30

BEGIN

PyTorch Paper Replicating: Main Takeaways, Exercises and Extra-Curriculum6:50

BEGIN

Section 09: PyTorch Model Deployment

57 lectures

What is Machine Learning Model Deployment and Why Deploy a Machine Learning Model9:35

BEGIN

Three Questions to Ask for Machine Learning Model Deployment7:13

BEGIN

Where Is My Model Going to Go?13:34

BEGIN

How Is My Model Going to Function?7:59

BEGIN

Some Tools and Places to Deploy Machine Learning Models5:49

BEGIN

What We Are Going to Cover4:01

BEGIN

Getting Setup to Code6:15

BEGIN

Downloading a Dataset for Food Vision Mini3:23

BEGIN

Outlining Our Food Vision Mini Deployment Goals and Modelling Experiments7:59

BEGIN

Creating an EffNetB2 Feature Extractor Model9:45

BEGIN

Create a Function to Make an EffNetB2 Feature Extractor Model and Transforms6:29

BEGIN

Creating DataLoaders for EffNetB23:31

BEGIN

Training Our EffNetB2 Feature Extractor and Inspecting the Loss Curves9:15

BEGIN

Saving Our EffNetB2 Model to File3:24

BEGIN

Getting the Size of Our EffNetB2 Model in Megabytes5:51

BEGIN

Collecting Important Statistics and Performance Metrics for Our EffNetB2 Model6:34

BEGIN

Creating a Vision Transformer Feature Extractor Model7:51

BEGIN

Creating DataLoaders for Our ViT Feature Extractor Model2:30

BEGIN

Training Our ViT Feature Extractor Model and Inspecting Its Loss Curves6:19

BEGIN

Saving Our ViT Feature Extractor and Inspecting Its Size5:08

BEGIN

Collecting Stats About Our ViT Feature Extractor5:51

BEGIN

Outlining the Steps for Making and Timing Predictions for Our Models11:15

BEGIN

Creating a Function to Make and Time Predictions with Our Models16:20

BEGIN

Making and Timing Predictions with EffNetB210:43

BEGIN

Making and Timing Predictions with ViT7:34

BEGIN

Comparing EffNetB2 and ViT Model Statistics11:31

BEGIN

Visualizing the Performance vs Speed Trade-off15:54

BEGIN

Gradio Overview and Installation8:39

BEGIN

Gradio Function Outline8:49

BEGIN

Creating a Predict Function to Map Our Food Vision Mini Inputs to Outputs9:51

BEGIN

Creating a List of Examples to Pass to Our Gradio Demo5:26

BEGIN

Bringing Food Vision Mini to Life in a Live Web Application12:12

BEGIN

Getting Ready to Deploy Our App Hugging Face Spaces Overview6:26

BEGIN

Outlining the File Structure of Our Deployed App8:11

BEGIN

Creating a Food Vision Mini Demo Directory to House Our App Files4:11

BEGIN

Creating an Examples Directory with Example Food Vision Mini Images9:13

BEGIN

Writing Code to Move Our Saved EffNetB2 Model File7:42

BEGIN

Turning Our EffNetB2 Model Creation Function Into a Python Script4:01

BEGIN

Turning Our Food Vision Mini Demo App Into a Python Script13:27

BEGIN

Creating a Requirements File for Our Food Vision Mini App4:11

BEGIN

Downloading Our Food Vision Mini App Files from Google Colab11:30

BEGIN

Uploading Our Food Vision Mini App to Hugging Face Spaces Programmatically13:36

BEGIN

Running Food Vision Mini on Hugging Face Spaces and Trying it Out7:44

BEGIN

Food Vision Big Project Outline4:17

BEGIN

Preparing an EffNetB2 Feature Extractor Model for Food Vision Big9:38

BEGIN

Downloading the Food 101 Dataset7:45

BEGIN

Creating a Function to Split Our Food 101 Dataset into Smaller Portions13:36

BEGIN

Turning Our Food 101 Datasets into DataLoaders7:23

BEGIN

Training Food Vision Big: Our Biggest Model Yet!20:15

BEGIN

Outlining the File Structure for Our Food Vision Big5:48

BEGIN

Downloading an Example Image and Moving Our Food Vision Big Model File3:33

BEGIN

Saving Food 101 Class Names to a Text File and Reading them Back In6:56

BEGIN

Turning Our EffNetB2 Feature Extractor Creation Function into a Python Script2:20

BEGIN

Creating an App Script for Our Food Vision Big Model Gradio Demo10:41

BEGIN

Zipping and Downloading Our Food Vision Big App Files3:45

BEGIN

Deploying Food Vision Big to Hugging Face Spaces13:34

BEGIN

PyTorch Model Deployment: Main Takeaways, Extra-Curriculum and Exercises6:13

BEGIN

Introduction to PyTorch 2.0 and torch.compile

24 lectures

Introduction to PyTorch 2.06:01

BEGIN

What We Are Going to Cover and PyTorch 2 Reference Materials1:21

BEGIN

Getting Started with PyTorch 2.0 in Google Colab4:19

BEGIN

PyTorch 2.0 - 30 Second Intro3:20

BEGIN

Getting Setup for PyTorch 2.02:22

BEGIN

Getting Info from Our GPUs and Seeing if They're Capable of Using PyTorch 2.06:49

BEGIN

Setting the Default Device in PyTorch 2.09:40

BEGIN

Discussing the Experiments We Are Going to Run for PyTorch 2.06:42

BEGIN

Creating a Function to Setup Our Model and Transforms10:17

BEGIN

Discussing How to Get Better Relative Speedups for Training Models8:23

BEGIN

Setting the Batch Size and Data Size Programmatically7:15

BEGIN

Getting More Speedups with TensorFloat-329:53

BEGIN

Downloading the CIFAR10 Dataset7:00

BEGIN

Creating Training and Test DataLoaders7:38

BEGIN

Preparing Training and Testing Loops with Timing Steps4:58

BEGIN

Experiment 1 - Single Run without Torch Compile8:22

BEGIN

Experiment 2 - Single Run with Torch Compile10:38

BEGIN

Comparing the Results of Experiments 1 and 211:19

BEGIN

Saving the Results of Experiments 1 and 24:39

BEGIN

Preparing Functions for Experiments 3 and 412:41

BEGIN

Experiment 3 - Training a Non-Compiled Model for Multiple Runs12:44

BEGIN

Experiment 4 - Training a Compiled Model for Multiple Runs9:57

BEGIN

Comparing the Results of Experiments 3 and 45:23

BEGIN

Potential Extensions and Resources to Learn More5:50

BEGIN

Where To Go From Here?

6 lectures

Thank You!1:17

BEGIN

Review This Course!

BEGIN

Become An Alumni

BEGIN

Learning Guideline

BEGIN

ZTM Events Every Month

BEGIN

LinkedIn Endorsements

BEGIN

Meet your instructor

Your PyTorch instructor (Daniel) isn't just a machine learning engineer with years of real-world professional experience. He has been in your shoes. He makes learning fun. He makes complex topics feel simple. He will motivate you. He will push you. And he will go above and beyond to help you succeed.

Daniel Bourke

Hi, I'm Daniel Bourke!

Daniel, a self-taught Machine Learning Engineer, has worked at one of Australia's fastest-growing artificial intelligence agencies, Max Kelsen, and is now using his expertise to teach thousands of students data science and machine learning.

SEE MY BIO & COURSES

Daniel Bourke

Machine Learning Engineer

Frequently asked questions

Are there any prerequisites for this course?

Required:

  • A computer (Linux/Windows/Mac) with an internet connection
  • Basic Python knowledge

Recommended:

  • Previous Machine Learning knowledge is recommended, but not required. Daniel provides sufficient supplementary resources to get you up to speed
  • Experience using Jupyter Notebooks or Google Colab is recommended

If you have no previous Machine Learning or Python experience, you can start with Daniel's Machine Learning Bootcamp which is also included with your ZTM Academy membership.

Who is this course for?

  • Anyone who wants a step-by-step guide to learning PyTorch and getting hired as a Deep Learning Engineer making over $100,000 / year
  • Students, developers, and data scientists who want to demonstrate practical machine learning skills by actually building and training real models using PyTorch
  • Anyone looking to expand their knowledge and toolkit when it comes to AI, Machine Learning and Deep Learning
  • Bootcamp or online PyTorch tutorial graduates that want to go beyond the basics
  • Students who are frustrated with their current progress because all of the beginner PyTorch tutorials out there don't go beyond the basics and don't give you the real-world practice or skills you need to actually get hired

Do you provide a certificate of completion?

We definitely do, and they are quite nice. You will also be able to add Zero To Mastery Academy to the education section of your LinkedIn profile.

Can I use the course projects in my portfolio?

Yes, you’d be crazy not to in our slightly biased opinion! All projects are downloadable and ready to use the minute you join.

Many of our students tell us the projects they built while following along with our courses were what got them interviews and because they built the projects themselves, they could confidently explain and walk through their work during the interview.

You know what that means? Job offer!

Are there subtitles?

Yes! We have high quality subtitles in 11 different languages: English, Spanish, French, German, Dutch, Romanian, Arabic, Hindi, Portuguese, Indonesian, and Japanese.

You can even adjust the text size, color, background and more so that the subtitles are perfect just for you!

Still have more questions about the Academy?

Still have more questions specific to the Academy membership? No problem, we answer some more here.

Invest in a better you. For less than a coffee a day.

Choose your currency:
$ USD US Dollar

Lifetime
$999 (only pay once, ever)
You're serious about advancing your career and never getting left behind
Start Learning Now

MOST POPULAR
Annual
$279 / year ($23 / month)
Save 40% vs. monthly (that's $189 a year)
You're committed to getting hired and starting a career in tech
Start Learning Now

Monthly
$39 / month
You're ready to upskill and advance your career
Start Learning Now

Every ZTM membership includes:

Unlimited access to all courses, projects + workshops, and career paths
Access to our private Discord with 400,000+ members
Access to our private LinkedIn networking group
Custom ZTM course completion certificates
Live career advice sessions with mentors, every month
Full access to all future courses, content, and features
100% RISK FREE

We know you'll love ZTM. That's why we provide a no hassle, 30-day money back guarantee.