Logistic regression - Best choice of learning rate part 9

In order for Gradient Descent to work, we must choose the learning rate wisely. The learning rate α determines how rapidly we update the parameters. If the learning rate is too large, we may "overshoot" the optimal value; conversely, if it is too small, we will need too many iterations to converge to the best values. That's why it is crucial to use a well-tuned learning rate.
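Both failure modes are easy to demonstrate on a toy one-dimensional cost such as J(w) = w², whose gradient is dJ/dw = 2w (the function, starting point, and step counts below are illustrative choices, not values from the video):

```python
def descend(alpha, w=1.0, steps=5):
    """Plain gradient descent on J(w) = w**2; the minimum is at w = 0."""
    trajectory = [w]
    for _ in range(steps):
        w = w - alpha * 2 * w   # update rule: w := w - alpha * dJ/dw
        trajectory.append(w)
    return trajectory

print(descend(alpha=0.05))  # too small: w shrinks by only 10% per step
print(descend(alpha=0.4))   # well tuned: w shrinks by 80% per step
print(descend(alpha=1.1))   # too large: |w| grows 20% per step -- overshoot and divergence
```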

To see this effect in practice, we'll compare the learning curves of our model for several choices of learning rate, as in the sketch below.
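A minimal, self-contained version of that comparison might look like the following (the toy dataset, iteration count, and the particular learning rates are stand-ins for the values used in the series, not the tutorial's actual code):

```python
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, Y, alpha, num_iterations=1500):
    """Train logistic regression with learning rate alpha,
    recording the cost every 100 iterations."""
    n, m = X.shape
    w, b = np.zeros((n, 1)), 0.0
    costs = []
    for i in range(num_iterations):
        A = sigmoid(w.T @ X + b)                  # forward pass, shape (1, m)
        A = np.clip(A, 1e-10, 1 - 1e-10)          # avoid log(0) if a rate saturates the sigmoid
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        dw = (X @ (A - Y).T) / m                  # gradient of the cost w.r.t. w
        db = np.sum(A - Y) / m                    # gradient of the cost w.r.t. b
        w -= alpha * dw                           # the update scaled by alpha
        b -= alpha * db
        if i % 100 == 0:
            costs.append(cost)
    return costs

# Toy data standing in for the tutorial's training set.
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 200))
Y = (X[0:1] + X[1:2] > 0).astype(float)

for alpha in (0.01, 0.1, 1.0):
    plt.plot(train(X, Y, alpha), label=f"alpha = {alpha}")
plt.xlabel("iterations (hundreds)")
plt.ylabel("cost")
plt.legend()
plt.show()
```

With a well-tuned rate the cost drops smoothly; with a rate that is too small it decreases but only slowly, and with one that is too large it typically oscillates or even climbs.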

With that, we have built the simplest Logistic Regression model with a neural network mindset. If you would like to experiment further, you can play with the learning rate and the number of iterations, or try different initialization methods and compare the results (a small helper for that is sketched below). This was the last tutorial in the Logistic Regression series; next we'll start building a simple neural network!
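For the initialization experiment, one simple way to make the scheme swappable is a helper like this (a hypothetical function written for illustration; it is not part of the tutorial code):

```python
import numpy as np

def initialize(n, method="zeros", scale=0.01, seed=0):
    """Return initial weights w (shape (n, 1)) and bias b for two simple schemes.
    For logistic regression, zeros work fine; random initialization mainly
    starts to matter once we move on to neural networks."""
    if method == "zeros":
        w = np.zeros((n, 1))
    elif method == "random":
        w = np.random.default_rng(seed).standard_normal((n, 1)) * scale
    else:
        raise ValueError(f"unknown method: {method}")
    return w, 0.0
```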

✅ Support My Channel Through Patreon:

✅ One-Time Contribution Through PayPal: