PyTorch Tutorial for Beginners | Basics & Gradient Descent | Tensors, Autograd & Linear Regression

💻 A beginner-friendly introduction to PyTorch basics: tensors, gradients, autograd, etc.
🛠 Working on linear regression & gradient descent from scratch

Code and Resources:

This tutorial is an executable Jupyter notebook hosted on Jovian (don't worry if these terms seem unfamiliar; we'll learn more about them soon). You can run this tutorial and experiment with the code examples in a couple of ways: using free online resources (Google Colab, Kaggle, Binder) or on your local machine.

Time Breaks:
00:00 Introduction
04:45 PyTorch Basics and Gradients
05:47 Tensors
16:31 Tensor Functions
18:55 Interoperability with NumPy
23:36 Summary and Further Reading
27:34 Gradient Descent and Linear Regression
28:20 Linear Regression
46:58 Computing Gradients
1:06:02 Dataset and DataLoader
1:19:09 Machine Learning vs Classical Programming
1:25:05 Assignment 1 - All about torch.Tensor
1:33:39 Jovian Mentorship Program
1:34:36 Course Overview
1:36:47 What to do next?

Topics covered in this video:
⌨️ Introduction to machine learning and Jupyter notebooks
⌨️ PyTorch basics: tensors, gradients, and autograd
⌨️ Linear regression & gradient descent from scratch
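As a taste of the tensors-and-autograd material, here is a minimal sketch (not taken from the video itself) of how `requires_grad` and `backward()` work together:

```python
import torch

# Create tensors; requires_grad=True tells autograd to track operations
x = torch.tensor(3.0)
w = torch.tensor(4.0, requires_grad=True)
b = torch.tensor(5.0, requires_grad=True)

y = w * x + b          # y = 4*3 + 5 = 17
y.backward()           # compute dy/dw and dy/db via autograd

print(y.item())        # 17.0
print(w.grad.item())   # dy/dw = x = 3.0
print(b.grad.item())   # dy/db = 1.0
```

The `.grad` attributes are populated only on the tensors created with `requires_grad=True`; plain data tensors like `x` get no gradient.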

Deep Learning with PyTorch: Zero to GANs is a beginner-friendly online course offering a practical and coding-focused introduction to deep learning using the PyTorch framework.

This course is taught by Aakash N S, co-founder & CEO of Jovian, a platform for sharing, showcasing, and collaborating on data science projects online.

--

#PyTorch #DeepLearning #LinearRegression #GradientDescent #Python #Certification
Comments

I went through a lot of course materials. This one is the best.
I really appreciate how keenly the instructor covers the topics, and the assignment, in which we are asked not only to learn the syntax and usage but also to produce errors to better understand every aspect of a function, is really intuitive. Thanks again.

vishalbothra

Your explanation of gradient descent is just beautiful. Well explained. Thank you.

zigzag

I am taking a Deep Learning module at university, and this video cleared up a lot of things that we skimmed over in lectures (mainly because of time limitations). Thank you!

alibaba

What a tutorial. Just too good. Absolutely loved it.

Arun-Home

One of the best tutorials on gradient descent using PyTorch. Just wow.

ngchingjie

Explained linear regression and computing gradients in a great way by working through an example. Thanks, it really helped.

ShivaniKanamarlapudi

Thanks for the free tutorials!!
Am I the only one who noticed the Pokémon reference in the sample data, lol

DiYuan

Even better than ChatGPT. India brings out the best tech people ❤ 🇸🇦

raghadalqobali

Thanks for this video, it was awesome, even better than some paid videos. Please try to teach TensorFlow as well.

anuragshrivastava

I am a fanboy of Aakash's teaching style. The last instructor I liked this much was Mitesh Khapra, a senior faculty member from IITM.

siddharthpangotra

This course is actually great and helpful 👍

urpaps

Why did you define a (2, 3) matrix and then transpose it during multiplication, instead of directly specifying a (3, 2) matrix for the weights, during the linear regression at 36:00?
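To illustrate the question above: the two weight layouts are mathematically equivalent, and the (out_features, in_features) shape matches the convention PyTorch's own `nn.Linear` uses for its weight. A minimal sketch (shapes chosen for illustration; the video's exact values may differ):

```python
import torch

torch.manual_seed(0)
x = torch.randn(5, 3)          # 5 samples, 3 input features

# Layout used in the tutorial: weights as (2, 3), transposed in the matmul
w = torch.randn(2, 3)
b = torch.randn(2)
y1 = x @ w.t() + b

# Equivalent: define the weights directly as (3, 2)
w2 = w.t().contiguous()
y2 = x @ w2 + b

print(torch.allclose(y1, y2))  # True - both layouts give the same predictions
```

Keeping the weights as (out_features, in_features) makes the manual code line up with `nn.Linear(3, 2).weight`, which also has shape (2, 3).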

rosisneupane

God bless you, brother. You are amazing.

ashish-blessings

Can I still get a certificate on completion of this course? I enrolled today.

areebhussain

Hi, when trying to decrease the loss by subtracting the gradients, the loss doesn't change. I thought I was doing something wrong, but I ran your notebook as well and got the same result. I don't know if a newer version of PyTorch changed something, but the loss stays the same for some reason.

Edit: In the first part of the gradient descent, the inputs were not passed through the model again; once new predictions were generated from the model, the loss decreased as it should.
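The behaviour the commenter describes can be reproduced with a minimal sketch (hypothetical random data, not the notebook's values): the loss only drops if predictions are recomputed with the updated parameters after the gradient step:

```python
import torch

torch.manual_seed(42)
inputs = torch.randn(10, 3)
targets = torch.randn(10, 2)

w = torch.randn(2, 3, requires_grad=True)
b = torch.randn(2, requires_grad=True)

def model(x):
    return x @ w.t() + b

def mse(pred, target):
    return ((pred - target) ** 2).mean()

loss_before = mse(model(inputs), targets)
loss_before.backward()

# Gradient step; no_grad() stops autograd from tracking the update itself
with torch.no_grad():
    w -= w.grad * 1e-2
    b -= b.grad * 1e-2
    w.grad.zero_()
    b.grad.zero_()

# Key point: recompute predictions with the *updated* parameters;
# re-printing the old loss tensor would show no change
loss_after = mse(model(inputs), targets)
print(loss_before.item(), "->", loss_after.item())
```

A loss tensor computed before the update is a fixed value; it does not re-evaluate when `w` and `b` change.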

joekanaan

What is the difference between performing linear regression like this and performing linear regression using sklearn?
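One way to see the relationship: sklearn's `LinearRegression` solves the least-squares problem in closed form, while the tutorial's approach iterates gradient descent toward the same solution. A minimal sketch with synthetic data, using `torch.linalg.lstsq` as a stand-in for sklearn's closed-form solve:

```python
import torch

torch.manual_seed(0)
X = torch.randn(100, 3)
true_w = torch.tensor([[2.0], [-1.0], [0.5]])
y = X @ true_w + 3.0                       # noise-free linear targets

# Closed-form least squares (the kind of solve sklearn performs);
# append a column of ones so the bias is fitted too
Xb = torch.cat([X, torch.ones(100, 1)], dim=1)
sol = torch.linalg.lstsq(Xb, y).solution

# Gradient descent on the same mean-squared-error objective
w = torch.zeros(4, 1, requires_grad=True)
for _ in range(2000):
    loss = ((Xb @ w - y) ** 2).mean()
    loss.backward()
    with torch.no_grad():
        w -= 0.1 * w.grad
        w.grad.zero_()

print(torch.allclose(w, sol, atol=1e-3))   # both reach (nearly) the same coefficients
```

For plain linear regression the closed form is faster and exact; the gradient-descent version matters because the same loop generalizes to models that have no closed-form solution.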

garvitpoddar

At 1:12:41, when you are generating predictions, you are using the pre-defined "model" function, so it is using the w and b values created manually with randn, not the ones from nn.Linear. You might want to check that.
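For readers following that section: a minimal sketch of how `nn.Linear` creates and uses its own parameters, as opposed to a manually defined `model` function closing over hand-made `w` and `b`:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
inputs = torch.randn(10, 3)

# nn.Linear registers its own weight (out_features, in_features) and bias
model = nn.Linear(3, 2)
preds = model(inputs)            # uses model.weight and model.bias internally

print(model.weight.shape)        # torch.Size([2, 3])
print(model.bias.shape)          # torch.Size([2])
print(preds.shape)               # torch.Size([10, 2])
```

If an older `model` function is still in scope, calling it will silently use the manually created tensors, which is the mix-up the comment points out.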

aination

How can I visualize the 5-node weighted graph on a 3x3 matrix?

subhadeeppaul

Can I still do this course, and will I get a certificate after completion?

geetapandey

While backpropagating, are we taking the derivatives with respect to x, or the derivatives of the loss function?
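Regarding the question above: `backward()` computes derivatives of the loss with respect to the tensors that have `requires_grad=True` (the weights), not with respect to the input data x. A minimal sketch with hand-checkable numbers:

```python
import torch

x = torch.tensor([[1.0, 2.0]])                 # input data (no grad needed)
w = torch.tensor([[0.5], [0.25]], requires_grad=True)
target = torch.tensor([[2.0]])

pred = x @ w                                   # 1*0.5 + 2*0.25 = 1.0
loss = ((pred - target) ** 2).mean()           # (1 - 2)^2 = 1.0
loss.backward()                                # d(loss)/dw, not d(loss)/dx

# d(loss)/dw = 2 * (pred - target) * x^T = 2 * (-1) * [1, 2]^T
print(w.grad)                                  # tensor([[-2.], [-4.]])
```

Autograd would only compute `x.grad` if `x` were created with `requires_grad=True`, which is not needed for training.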

datascience