Dive Into Deep Learning, Lecture 2: PyTorch Automatic Differentiation (torch.autograd and backward)
In this video, we discuss PyTorch’s automatic differentiation engine, torch.autograd, which powers neural network and deep learning training (e.g., stochastic gradient descent). You will get a conceptual understanding of how autograd computes the gradients of multivariable functions. We start with derivatives, partial derivatives, and the definition of the gradient. We then show how to compute gradients using requires_grad=True and the backward() method, covering automatic differentiation of both scalar-valued and non-scalar-valued functions, and we discuss the Jacobian matrix in PyTorch. Differentiation is a crucial step in nearly all machine learning and deep learning optimization algorithms. While the calculations for these derivatives are straightforward, working out the updates by hand quickly becomes painful and tedious.
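As a companion to the lecture, here is a minimal sketch of the workflow described above: marking a tensor with requires_grad=True, calling backward() on a scalar output, and calling backward() on a non-scalar output by passing a gradient vector (a vector-Jacobian product). The specific functions y = 2·(x·x) and y = x*x are illustrative choices, not necessarily the exact examples used in the video.

import torch

# Scalar-valued function: y = 2 * (x . x), so dy/dx = 4x
x = torch.arange(4.0, requires_grad=True)   # x = [0., 1., 2., 3.]
y = 2 * torch.dot(x, x)
y.backward()                                 # populates x.grad
print(x.grad)                                # tensor([ 0.,  4.,  8., 12.])
print(x.grad == 4 * x)                       # tensor([True, True, True, True])

# Non-scalar-valued function: backward() needs a "gradient" argument,
# i.e. the vector v in the vector-Jacobian product v^T J.
x.grad.zero_()                               # clear the accumulated gradient
y = x * x                                    # element-wise square, y is a vector
y.backward(torch.ones_like(y))               # equivalent to y.sum().backward()
print(x.grad)                                # tensor([0., 2., 4., 6.])

# For illustration, the full Jacobian of the element-wise square can also be
# materialized with torch.autograd.functional.jacobian (a diagonal matrix with
# 2*x on the diagonal).
from torch.autograd.functional import jacobian
print(jacobian(lambda t: t * t, x.detach()))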
#Autograd #PyTorch #DeepLearning
Dive into Deep Learning - Lecture 1: PyTorch Tensor Basics, Operations, Functions, and Broadcasting
Dive into Deep Learning D2L at WAIC'20
Dive Into Deep Learning - Lecture 3: Build a Simple Neural Network from Scratch with PyTorch
Dive into Deep Learning (Study Group): Introduction to Deep Learning | Session 1
MIT Introduction to Deep Learning (2023) | 6.S191
Dive into Deep Learning: Coding Session #1 – Setup & MLP (APAC)
Dive into Deep Learning: Coding Session #1 – Setup & MLP (Americas/EMEA)
EfficientML.ai Lecture 18 - Diffusion Models (Zoom Recording) (MIT 6.5940, Fall 2024)
Dive into Deep Learning (Study Group): Convolutional Neural Networks | Session 6
Dive into Deep Learning with Scala by Sören Brunk
How I’d learn ML in 2024 (if I could start over)
Dive Into Deep Learning Session 1 | Introduction
But what is a neural network? | Chapter 1, Deep learning
Dive into Deep Learning: Coding Session #2 – CNN model (APAC)
Deeper Dive Into Deep Learning: a survey of techniques (Raphael Gontijo Lopes)
Dive into Deep Learning - Lecture 4: Logistic/Softmax regression and Cross Entropy Loss with PyTorch
Dive into Deep Learning – Lec 6: Basics of Object-Oriented Programming in PyTorch (torch.nn.Module)...
Dive Into Deep Learning - Lecture 5: Parameter Access, Initialization, and storage in PyTorch
Dive into Deep Learning: Coding Session#5 Attention Mechanism II (Americas/EMEA)
Deep Dive into Deep Learning Pipelines continues - Sue Ann Hong & Tim Hunter
NLP | Dive into Deep Learning for NLP - Part 1
Deep Dive into Deep Learning Pipelines - Sue Ann Hong & Tim Hunter
Dive into Deep Learning Lec7: Regularization in PyTorch from Scratch (Custom Loss Function Autograd)