Backpropagation And Gradient Descent In Neural Networks | Neural Network Tutorial | Simplilearn


This video on backpropagation and gradient descent covers the basics of how backpropagation and gradient descent play a role in training neural networks, using an example of recognizing handwritten digits with a neural network. After predicting the results, you will see how to train the network using backpropagation to obtain results with high accuracy. Backpropagation is the process of updating the parameters of a network to reduce the error in its predictions. You will also learn how to calculate the loss function to measure the error in the model. Finally, with the help of a graph, you will see how to find the minimum of a function using gradient descent.
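To make the gradient descent idea concrete, here is a minimal sketch in Python. The quadratic loss, starting weight, and learning rate are illustrative assumptions, not values from the video.

```python
# Minimal sketch of gradient descent on a one-dimensional loss.
# The toy loss, starting point, and learning rate are assumptions
# chosen for illustration, not taken from the video.

def loss(w):
    """Toy loss: a parabola with its minimum at w = 3."""
    return (w - 3.0) ** 2

def grad(w):
    """Analytic derivative of the toy loss: d/dw (w - 3)^2 = 2(w - 3)."""
    return 2.0 * (w - 3.0)

w = 0.0              # initial weight (arbitrary starting point)
learning_rate = 0.1  # step size (assumed value)

for step in range(25):
    w -= learning_rate * grad(w)  # move opposite the slope

print(f"w after descent: {w:.4f}, loss: {loss(w):.6f}")  # w approaches 3
```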

#BackpropagationAndGradientDescent #BackpropagationInNeuralNetworks #Backpropagation #BackpropagationAlgorithm #BackpropagationExample #DeepLearningTutorial #DataScience #SimplilearnDeepLearning #DeepLearningCourse

Simplilearn’s Deep Learning course will transform you into an expert in deep learning techniques using TensorFlow, the open-source software library designed to conduct machine learning & deep neural network research. With our deep learning course, you'll master deep learning and TensorFlow concepts, learn to implement algorithms, build artificial neural networks, and traverse layers of data abstraction to understand the power of data and prepare for your new role as a deep learning scientist.

We recommend this deep learning online course particularly for the following professionals:
1. Software engineers
2. Data scientists
3. Data analysts
4. Statisticians with an interest in deep learning

Comments

After wasting several hours trying to understand these two concepts, this video finally explained them exceptionally well. Thank you.

GbSharp

This video is awesome, but there are some mistakes: at 7:53 it should be a negative slope, and at 8:21 the weight needs to be increased instead of reduced.

coxixx

Great video! But I do have some questions.

1) At 1:20, why don't some of the probabilities add up to 1?

2) At 6:11, isn't the slope negative? From what I understand, the slope indicates how a line changes at a point. At 6:11 the line is going down, so the slope would be negative; at 6:17 the line is going up, so the slope would be positive. If I remember correctly from Google's ML crash course, we move in the opposite direction of the slope (multiply by -1).

victorluna
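The sign convention this comment describes matches the standard update rule: with w_new = w - lr * slope, a negative slope increases the weight and a positive slope decreases it. A tiny worked example, with an assumed learning rate and illustrative slope values:

```python
# Worked example of the sign convention discussed in the comments.
# With the standard update w_new = w - lr * slope, a negative slope
# moves the weight up and a positive slope moves it down.
lr = 0.1
w = 2.0

slope = -4.0              # loss is decreasing as w grows
print(w - lr * slope)     # 2.4 -> weight increases

slope = +4.0              # loss is increasing as w grows
print(w - lr * slope)     # 1.6 -> weight decreases
```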

Thanks for the video. A whole day's lecture in 12 minutes.

arifmoazy

I'm confused; I think you mixed up positive and negative slopes here. A positive slope should mean a decrease in the weight, and vice versa.

rainyy

Really good video.

However, I keep feeling like some of the most essential things are being left out, namely what actually happens when the weights are being "updated". Let's say we have the three losses like in your example:
1. 0.49
2. 0.25
3. 0.04
Now, when I update the weights using these losses, what is actually happening? Am I going through all the weights and computing w * l? And in that case, which of the three losses?

Or is it being subtracted or added, or what in the world is going on?

bubblesgrappling
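For readers with the same question: in the standard setup the update is not w * loss. Each weight is adjusted by the derivative of the loss with respect to that weight, scaled by a learning rate. A hedged sketch with assumed numbers (this shows the usual rule, not necessarily the video's exact network):

```python
# Sketch of what "updating the weights" typically means.
# Each weight moves against the gradient of the loss with respect
# to that weight; the loss value itself is not multiplied in.
# All numbers here are illustrative assumptions.

import numpy as np

x = np.array([0.5, -1.2, 0.8])    # one training input (assumed values)
w = np.array([0.1, 0.4, -0.3])    # current weights (assumed values)
y_true = 1.0                      # target output

y_pred = float(np.dot(w, x))      # simple linear neuron, no activation
loss = (y_pred - y_true) ** 2     # squared-error loss

# Gradient of the loss w.r.t. each weight (chain rule):
# dL/dw_i = 2 * (y_pred - y_true) * x_i
grad_w = 2.0 * (y_pred - y_true) * x

lr = 0.1
w = w - lr * grad_w               # subtract, scaled by the learning rate
```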

I love these short segments on Deep Learning. Please keep them coming!

druestaples

So a point on the graph is the result of one training example? So if we only have one example, we wouldn't know the gradient, because you need two points for that, right? So we can't train the net with one image?

JazevoAudiosurf
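In standard training the gradient is not estimated from two plotted points: it is the analytic derivative of the loss at the current weights, which can be computed from a single example. An illustrative sketch (all values assumed):

```python
# Why one example is enough: the gradient is the exact derivative of
# the loss at the current weight, not a slope estimated from two
# plotted points. Values are illustrative assumptions.

w = 0.5                      # current weight
x, y_true = 2.0, 3.0         # a single training example

y_pred = w * x               # one-weight "network"
loss = (y_pred - y_true) ** 2

# dL/dw = 2 * (y_pred - y_true) * x  -- exact slope at this w,
# computed from this one example alone.
grad = 2.0 * (y_pred - y_true) * x
print(grad)                  # -8.0: negative slope, so w should increase
```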

Nice! I wish you would explain exactly how the weights change in backpropagation.

coxixx
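For readers wanting that detail: backpropagation chooses each weight change by applying the chain rule backwards from the loss. A minimal sketch for a single sigmoid neuron, with assumed example values (not the video's network):

```python
# Sketch of how backpropagation decides a weight change for one
# sigmoid neuron, via the chain rule. Inputs, parameters, and the
# learning rate are illustrative assumptions.
import math

x, y_true = 1.5, 0.0            # one training example (assumed values)
w, b = 0.8, 0.1                 # current parameters (assumed values)
lr = 0.5

z = w * x + b                   # pre-activation
a = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
loss = (a - y_true) ** 2

# Chain rule: dL/dw = dL/da * da/dz * dz/dw
dL_da = 2.0 * (a - y_true)
da_dz = a * (1.0 - a)           # derivative of the sigmoid
dz_dw = x
dL_dw = dL_da * da_dz * dz_dw

w -= lr * dL_dw                 # backprop's update for this weight
```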

How do you know if it is a positive slope or a negative slope?

vncntjms

Amazing lesson, very clear and helpful explanation.

solmaznaderi

Why is the sound so bad on this video?

Stephienova

This video is ML for housewives; it does not actually explain the math behind doing gradient descent on a neural network, just the concept.

jaredjunkin