PyTorch Most Common Techniques for Deep Learning | Data Science | Machine Learning

🔥🐍 Check out the MASSIVELY UPGRADED 2nd Edition of my book, with 1300+ pages of dense Python knowledge covering 350+ core Python 🐍 concepts

---------------------

👉 Chapters

@00:00:38 - Weight Decay in Neural Network with PyTorch
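For this chapter, the core idea reduces to the `weight_decay` argument of a PyTorch optimizer, which adds an L2 penalty on the parameters at every update. A minimal sketch (the single Linear layer and the hyperparameter values are illustrative, not from the video):

```python
import torch
import torch.nn as nn

# Any model works; a single Linear layer keeps the sketch small.
model = nn.Linear(10, 2)

# weight_decay=1e-4 adds an L2 penalty: each step also shrinks every
# weight by lr * weight_decay * w, on top of the gradient update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

x = torch.randn(4, 10)
loss = model(x).sum()
loss.backward()
optimizer.step()
```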

@00:17:45 - Label Smoothing Cross-Entropy-Loss from Scratch with PyTorch
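A from-scratch label-smoothing cross-entropy, as the chapter title describes, can be verified against PyTorch's built-in `label_smoothing` argument of `F.cross_entropy`. The function name `smooth_ce` is my own; this is one standard formulation, not necessarily the exact code from the video:

```python
import torch
import torch.nn.functional as F

def smooth_ce(logits, target, eps=0.1):
    """Cross-entropy against the smoothed target distribution:
    (1 - eps) on the true class plus eps/n spread uniformly over all n classes."""
    logp = F.log_softmax(logits, dim=-1)
    uniform = -logp.mean(dim=-1)                            # -(1/n) * sum_k log p_k
    true_cls = -logp.gather(1, target.unsqueeze(1)).squeeze(1)
    return ((1 - eps) * true_cls + eps * uniform).mean()
```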

@00:28:09 - Dropout Layer in PyTorch Neural Network
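The key behavior this chapter covers is that `nn.Dropout` acts only in training mode and is a no-op in eval mode. A small sketch (p=0.5 is just an example value):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(10000)

drop.train()       # training: zero each element with prob p, scale survivors by 1/(1-p)
y_train = drop(x)

drop.eval()        # evaluation: dropout is the identity function
y_eval = drop(x)
```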

@00:32:58 - Get Total Number of Parameters in a PyTorch Neural Network Model
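The parameter count in this chapter is a one-liner over `model.parameters()`. A sketch with an illustrative two-layer model:

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 4), nn.ReLU(), nn.Linear(4, 2))

total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
# Linear(10, 4): 10*4 + 4 = 44 params; Linear(4, 2): 4*2 + 2 = 10  ->  54 total
```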

@00:38:25 - PyTorch Tensor vs Numpy array & detach-method of PyTorch
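Two facts this chapter revolves around: a tensor that requires grad must be `detach()`-ed before `.numpy()`, and on CPU the resulting array shares memory with the tensor. A minimal sketch:

```python
import torch

t = torch.tensor([1.0, 2.0], requires_grad=True)

# t.numpy() raises on a tensor that requires grad; detach() it first.
arr = t.detach().numpy()

# On CPU, detach() returns a view on the same storage,
# so mutating the NumPy array mutates the tensor too:
arr[0] = 5.0
```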

@00:44:33 - Cross Entropy Loss in PyTorch and its relation with Softmax
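The relation this chapter explains: `F.cross_entropy` takes raw logits and applies log-softmax plus negative log-likelihood internally, so no softmax layer should precede it. A quick equivalence check:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(3, 5)
target = torch.tensor([0, 2, 4])

# cross_entropy == log_softmax followed by NLL loss
ce = F.cross_entropy(logits, target)
manual = F.nll_loss(F.log_softmax(logits, dim=1), target)
```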

@00:59:04 - Lazy Linear Module in PyTorch
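`nn.LazyLinear` lets you omit `in_features`; it is inferred from the first forward pass. A minimal sketch (shapes are illustrative):

```python
import torch
import torch.nn as nn

layer = nn.LazyLinear(out_features=8)   # in_features left unspecified
y = layer(torch.randn(4, 20))           # inferred as 20 on the first forward pass
```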

@01:04:58 - Input Shape of Tensor in Neural Network for PyTorch
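A common input-shape point this chapter is likely to cover: `nn.Conv2d` expects a 4-D `(batch, channels, height, width)` tensor, so a single CHW image needs a batch dimension added. A sketch with illustrative sizes:

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3)

img = torch.randn(3, 32, 32)     # one CHW image
batch = img.unsqueeze(0)         # Conv2d wants (N, C, H, W): add the batch dim
out = conv(batch)                # spatial size shrinks: 32 - 3 + 1 = 30
```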

@01:36:19 - Plotting Activation Functions | PyTorch | Sigmoid | ReLU | Tanh | Neural Network

@01:41:42 - Quantization in PyTorch | Mixed Precision Training

@02:02:09 - LeNet from Scratch - Shape Calculation @ each Layer
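The shape calculation this chapter walks through follows the rule `out = (in - kernel + 2*padding) // stride + 1`. A sketch of LeNet's classic first layers on a 32×32 grayscale input (layer choices follow the original LeNet-5, but this is not the video's exact code):

```python
import torch
import torch.nn as nn

# Shape rule for conv/pool layers: out = (in - kernel + 2*padding) // stride + 1
x = torch.randn(1, 1, 32, 32)            # LeNet's classic 32x32 grayscale input

c1 = nn.Conv2d(1, 6, kernel_size=5)      # (32 - 5) + 1 = 28  -> (1, 6, 28, 28)
s2 = nn.AvgPool2d(kernel_size=2)         # 28 / 2 = 14        -> (1, 6, 14, 14)
c3 = nn.Conv2d(6, 16, kernel_size=5)     # (14 - 5) + 1 = 10  -> (1, 16, 10, 10)
s4 = nn.AvgPool2d(kernel_size=2)         # 10 / 2 = 5         -> (1, 16, 5, 5)

y = s4(c3(s2(c1(x))))
```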

@02:21:03 - Learning Rate Scheduler | PyTorch | Implementing Custom Scheduler for CycleGAN
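The CycleGAN paper's schedule keeps the learning rate constant for the first half of training, then decays it linearly to zero; a custom scheduler for it fits naturally into `LambdaLR`. A sketch (epoch counts and the base learning rate are illustrative):

```python
import torch

opt = torch.optim.Adam([torch.zeros(1, requires_grad=True)], lr=2e-4)

n_epochs, decay_start = 200, 100   # illustrative values

def cyclegan_lambda(epoch):
    # multiplier 1.0 until decay_start, then linear decay to 0 at n_epochs
    return 1.0 - max(0, epoch - decay_start) / (n_epochs - decay_start)

sched = torch.optim.lr_scheduler.LambdaLR(opt, lr_lambda=cyclegan_lambda)

for _ in range(150):
    opt.step()       # a real training step would compute a loss and backprop here
    sched.step()
```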

@02:47:21 - Hausdorff Distance used in Object Detection | Semantic Segmentation

@03:29:18 - Pixel Accuracy in Image Segmentation | Object Detection
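Pixel accuracy, as covered in this chapter, is simply the fraction of pixels whose predicted class matches the ground truth. A minimal sketch:

```python
import torch

def pixel_accuracy(pred, target):
    """Fraction of pixels whose predicted class label matches the ground truth."""
    return (pred == target).float().mean().item()
```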

@03:36:46 - Mean-IoU (Intersection over Union) in Object Detection | Semantic Segmentation
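Mean IoU averages, over the classes present, the ratio of intersection to union between the predicted and ground-truth masks. A sketch of one common formulation (classes absent from both masks are skipped; this may differ in detail from the video's implementation):

```python
import torch

def mean_iou(pred, target, num_classes):
    """Mean over classes of |pred ∩ target| / |pred ∪ target| on label masks."""
    ious = []
    for c in range(num_classes):
        p, t = pred == c, target == c
        union = (p | t).sum().item()
        if union == 0:
            continue                      # class absent in both masks: skip it
        inter = (p & t).sum().item()
        ious.append(inter / union)
    return sum(ious) / len(ious)
```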

======================

You can find me here:


Other playlists you might like 👇

#machinelearning #deeplearning #datascience #finance #python #kaggle #tensorflow #pytorch #100daysofmlcode #pythonprogramming #100DaysOfMLCode #AI #trading #bitcoin
Comments

Hi Rohan, thank you so much. I just can’t stop appreciating you. You are my hero.

rakiop

Apologies for the multiple intros at the beginning of each chapter in the video. I forgot to edit those out 😓

RohanPaul-AI

Amazing channel (y). May I request a video explaining the effect of batch size, learning rate, and optimizer choice (e.g. Adam vs. AdamW) on deep learning? hehehe...

saluangja