In-Depth Look: ResNet Architecture and Residual Block Explained


You will also get access to all the technical courses inside the program, including the ones I plan to make in the future! Check out the technical courses below 👇

_____________________________________________________________

In this video 📝 we will talk about the ResNet Architecture. Residual Neural Networks are often used to solve computer vision problems and consist of several residual blocks. We will talk about what a residual block is and compare it to the architecture of a standard convolutional neural network. I'll show you how you can use the pre-trained ResNets from Keras and TensorFlow.
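As a minimal sketch of the Keras/TensorFlow usage mentioned above (assumptions: `weights=None` is used here to skip the ImageNet weight download; pass `weights="imagenet"` to get the actual pre-trained network):

```python
import numpy as np
import tensorflow as tf

# Build a ResNet50 from tf.keras.applications. weights=None skips the
# ImageNet download; use weights="imagenet" for the pre-trained filters.
model = tf.keras.applications.ResNet50(weights=None)

# The standard input is a 224x224 RGB image; the classification head
# has 1000 ImageNet classes.
dummy = np.random.rand(1, 224, 224, 3).astype("float32")
x = tf.keras.applications.resnet50.preprocess_input(dummy * 255.0)
preds = model.predict(x)
print(preds.shape)  # (1, 1000)
```

The same pattern works for the deeper variants (`ResNet101`, `ResNet152`) exposed under `tf.keras.applications`.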

If you enjoyed this video, be sure to press the 👍 button so that I know what content you guys like to see.

_____________________________________________________________

Tags:
#ResNet #ResidualBlock #NeuralNetworks #DeepLearning #NeuralNetworksPython #NeuralNetworksTutorial #DeepLearningTutorial #Keras #TensorFlow
Comments

Join My AI Career Program
Enroll in My School and Technical Courses

NicolaiAI

You clearly know what you're talking about, but your presentation's a mess: you should write, and stick to, a script.
Don't be afraid to take a breath; it's like you're rushing to get your sentences out.

Nagrom

Thanks for the video. I think I finally understand why they are so effective. I'd love to look at a simple trained ResNet so I can analyze the weights and see what's happening.

mfpears

I love the way he explains this architecture; it helped me a lot to understand ResNet.

huangshijie

Helped me a lot. Keep on making great stuff like this!!

itsfabiolous

"Here" counter: 161 times in 13 minutes.
That's about 12 "here"s per minute,
or one "here" every 5 seconds.

Graveness

Very good explanation. It will help me in my research.

jagyansenipandapanda

Has the ResNet-1000 architecture been built?

PE-gwgu

I totally don't understand why it's hard to compute the direct mapping in a standard CNN, or rather, why it's easy to do with residual layers: in a residual layer, in order to get f(X) + X, you still have to compute f(X), right?
Furthermore, what do you mean by direct mapping? Is it just f(X)?

petarulev
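On the shortcut question above: f(X) + X does still require computing f(X), but the point is what is easy to represent. With the shortcut, the block acts as the identity whenever the residual branch f outputs zero, which the weights can reach trivially; a plain stack of layers would have to learn the full identity mapping itself. A minimal NumPy sketch (toy linear + ReLU branch; all names here are hypothetical illustrations, not the video's code):

```python
import numpy as np

def residual_branch(x, w):
    # Toy residual function f(X): one linear map followed by ReLU.
    return np.maximum(0.0, x @ w)

def residual_block(x, w):
    # The block outputs f(X) + X: the shortcut adds the input back.
    return residual_branch(x, w) + x

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))

# If the branch weights are zero, f(X) = 0 and the whole block is
# exactly the identity; a plain layer without the shortcut would
# instead have to learn identity weights to achieve the same thing.
w = np.zeros((4, 4))
print(np.allclose(residual_block(x, w), x))  # True
```

This is why very deep residual networks remain trainable: each block only has to learn a small correction on top of the identity, rather than the full mapping from scratch.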