Advanced Methods for Hyperparameter Tuning

In this video, we learn how to tune hyperparameters of the network with advanced methods like Bayesian search, gradient-based search, and evolutionary computing.
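The gradient-based search mentioned above can be sketched in a few lines: estimate the derivative of the validation score with respect to a hyperparameter via finite differences and step uphill. This is a minimal toy sketch, not the video's implementation; `val_accuracy` is a made-up smooth response surface standing in for an actual training run.

```python
# Toy "validation accuracy" as a smooth function of one hyperparameter
# (say, a log learning rate). In practice this would train a model;
# here it is a made-up surface with a single peak at log_lr = -3.
def val_accuracy(log_lr):
    return 0.9 - 0.05 * (log_lr + 3.0) ** 2

def finite_diff_ascent(x, steps=50, eps=1e-3, lr=0.5):
    """Gradient-based search: estimate d(accuracy)/d(hyperparam) with a
    central finite difference and take a step uphill."""
    for _ in range(steps):
        grad = (val_accuracy(x + eps) - val_accuracy(x - eps)) / (2 * eps)
        x += lr * grad
    return x

best = finite_diff_ascent(x=0.0)
print(round(best, 2), round(val_accuracy(best), 3))  # converges near -3
```

Real gradient-based tuners differentiate through the training procedure itself rather than using finite differences, but the hill-climbing idea is the same.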

❓To get the most out of the course, don't forget to answer the end-of-module questions:

👉 You can find the answers here:

RESOURCES:

COURSES:

Comments

Hello Misra,

I must say, your videos are becoming quite the saga for me! It's like a suspenseful movie series that I can't get enough of. Especially that model you showcased with its 34% accuracy – talk about leaving us on the edge of our seats! Are we going to witness an optimization showdown in the upcoming episodes? The anticipation is real. Can't wait to see if you'll circle back to that captivating moment in the coming videos. Keep the excitement coming!

Cheers,

Tehrani

Excellent instructional videos. Regarding tuning a neural network, how does one know (if it is even possible to know) what the optimal result could be? For example, sometimes the data itself does not allow 95% accuracy and the Bayesian-optimal result is 75% accuracy, in which case it would be a waste of time, effort, and resources to try to push beyond 75%. Thx.

AKS
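The question above is a good one: the achievable ceiling is usually unknown in advance, so a common practical proxy is to stop tuning once the best score has not improved for some number of trials. A minimal sketch of that patience rule, with `run_trial` as a hypothetical stand-in for training one sampled configuration:

```python
import random

random.seed(0)

def run_trial():
    """Stand-in for training with one sampled hyperparameter config;
    returns a validation score. Here: noisy scores capped near 0.75,
    mimicking data whose ceiling is well below 95%."""
    return min(0.75, random.gauss(0.70, 0.04))

best, since_improved, patience = float("-inf"), 0, 10
trials = 0
while since_improved < patience:
    score = run_trial()
    trials += 1
    if score > best + 1e-3:   # require a meaningful improvement
        best, since_improved = score, 0
    else:
        since_improved += 1

print(trials, round(best, 3))  # search stops once scores plateau
```

The same idea appears in tuning libraries as early stopping or pruning callbacks; the patience threshold and minimum-improvement margin here are arbitrary illustrative values.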

I appreciate it and am glad that you shared what you know with us 💗💗💗

theusercametochill

Thanks for sharing, this is awesome! Any idea how effective the nature-inspired optimization techniques are? I see them widely used in many research papers: whale optimizer, reptile search, swarm-based methods, etc.

chakra-ai
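The swarm-based methods mentioned in the comment above mostly share one core loop, which particle swarm optimization illustrates well: each candidate moves under the pull of its own best position and the swarm's best. A minimal stdlib sketch over a made-up two-hyperparameter response surface (the peak location and all coefficients are illustrative, not from the video):

```python
import random

random.seed(1)

# Toy response surface over two hyperparameters (e.g. log-lr, dropout);
# peak accuracy 0.9 at (-3.0, 0.2). A real run would train a model instead.
def score(p):
    x, y = p
    return 0.9 - 0.05 * (x + 3.0) ** 2 - 0.5 * (y - 0.2) ** 2

n, dims, iters = 12, 2, 60
w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
pos = [[random.uniform(-5, 5) for _ in range(dims)] for _ in range(n)]
vel = [[0.0] * dims for _ in range(n)]
pbest = [p[:] for p in pos]           # each particle's best-seen position
gbest = max(pbest, key=score)[:]      # swarm-wide best position

for _ in range(iters):
    for i in range(n):
        for d in range(dims):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if score(pos[i]) > score(pbest[i]):
            pbest[i] = pos[i][:]
            if score(pbest[i]) > score(gbest):
                gbest = pbest[i][:]

print([round(v, 2) for v in gbest], round(score(gbest), 3))
```

Whale, reptile, and similar variants mainly differ in how the velocity update is shaped; their effectiveness on hyperparameter tuning is an empirical question and often close to well-tuned random search.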

Unfortunately, most of the Python libraries mentioned in this video are outdated, and some have not been updated on GitHub for many years...

franky

Very fast.
A few more code examples would have made it better.

pra

Evolutionary algorithms interest me; I'll try sklearn-deap.

bay-bicerdover
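Since a comment above notes that some of these libraries (sklearn-deap included) are no longer maintained, the core idea behind evolutionary hyperparameter search is easy to sketch without them: keep the best configurations, mutate them to produce offspring, and select again. A minimal (mu+lambda) loop under the same illustrative assumptions as before (toy fitness surface, arbitrary population sizes):

```python
import random

random.seed(2)

# Toy fitness: "validation accuracy" over two hyperparameters, peaking at
# (-3.0, 0.2). A real setup would cross-validate a model per candidate.
def fitness(ind):
    x, y = ind
    return 0.9 - 0.05 * (x + 3.0) ** 2 - 0.5 * (y - 0.2) ** 2

def mutate(ind, sigma=0.3):
    """Gaussian mutation: jitter every gene (hyperparameter) slightly."""
    return [g + random.gauss(0, sigma) for g in ind]

mu, lam, gens = 5, 20, 40
pop = [[random.uniform(-5, 5), random.uniform(0, 1)] for _ in range(mu)]
for _ in range(gens):
    offspring = [mutate(random.choice(pop)) for _ in range(lam)]
    # (mu+lambda) selection: parents compete with offspring, keep the top mu
    pop = sorted(pop + offspring, key=fitness, reverse=True)[:mu]

best = pop[0]
print([round(g, 2) for g in best], round(fitness(best), 3))
```

Libraries like DEAP add crossover, typed genes, and parallel evaluation on top of this loop, but the select-mutate-reselect cycle is the whole algorithm.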