Automatic Neural Network Hyperparameter Tuning for TensorFlow Models using Keras Tuner in Python



Don't forget to subscribe if you enjoyed the video :D
Comments

I like how you explain what each portion of the code means and where it comes from as you write it. It makes understanding what everything is doing so much better!

lukasshipley

I’m using this for a CNN. Thanks for the clear presentation and walk through. You’re the man

proterotype

Thank you for this extremely useful video!! It cleared up all my confusion regarding hyperparameter tuning!!

shrutihegde

Thank you Greg. That's very helpful.

drjabirrahman

Actually incredibly useful, thank you so much.

ithaca

Hi, I'm having problems with the epochs and the number of test runs. For example, when I want it to adjust the dropout from 0 to 0.5 in steps of 0.1, it runs the search for each option with 2 epochs and then stops. So basically nothing is really tested and it stops prematurely.

max_epochs can be set to 100 or anything else and it changes nothing. Setting the epochs in search() does literally nothing; I could even set it to -1 or a string of gibberish.

But when I run a search where I configured a lot in Hyperband (including CNN stride, filter size, kernel, etc.) it works fine and does what it should.

The more I try, the more I'm…

thefruitsofzei
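
A note on the behaviour described above: it matches how Hyperband is designed rather than a bug. Hyperband deliberately trains many configurations for only a few epochs, keeps the most promising ones, and retrains them for longer; the per-trial epoch count is derived from max_epochs and the bracket schedule, so an epochs argument passed to search() appears to be overridden by the tuner itself. A configuration sketch (assuming keras_tuner and a build_model function; not runnable here):

```python
# Sketch: Hyperband controls the epoch budget itself.
# tuner = keras_tuner.Hyperband(
#     build_model,
#     objective="val_accuracy",
#     max_epochs=30,   # the longest any single trial will train
#     factor=3,        # early rounds train only a few epochs, then promote
# )
# tuner.search(x_train, y_train, validation_data=(x_val, y_val))
# # Passing epochs=... to search() is overridden: Hyperband sets the
# # epoch count per bracket, which is why early trials stop after 2 epochs.
```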

Hello.
What if I want to optimize the batch size or the number of epochs, or use callbacks such as ReduceLROnPlateau or EarlyStopping? Can I use Keras Tuner for that? Or is it already included in the search procedure?

juanete
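
A note on this question: callbacks such as EarlyStopping are not tuned, but they can be passed straight to tuner.search(), which forwards keyword arguments to model.fit(); batch size (and epochs) can be made searchable by overriding HyperModel.fit. A sketch assuming keras_tuner and TensorFlow are installed (not runnable here):

```python
# import keras_tuner as kt
# from tensorflow import keras
#
# class MyHyperModel(kt.HyperModel):
#     def build(self, hp):
#         ...  # build and compile the model as in the video
#
#     def fit(self, hp, model, *args, **kwargs):
#         # Make the batch size part of the search space.
#         return model.fit(*args,
#                          batch_size=hp.Choice("batch_size", [16, 32, 64]),
#                          **kwargs)
#
# tuner = kt.RandomSearch(MyHyperModel(), objective="val_loss", max_trials=10)
# tuner.search(x_train, y_train, validation_split=0.2,
#              callbacks=[keras.callbacks.EarlyStopping(patience=3)])
```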

Amazing video! Thank you so much! Saved my ass on my assignment 😂

hawaiisunkissselftanner

Hey Greg, I'm applying your techniques to model time series data, and I'd like to ask: when we call tuner.search() and pass it the training data and validation data, does it randomly shuffle the data? You probably already understand why I'm asking. Thank you.

And if it does, how can we preserve the time series ordering?

gamuchiraindawana
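
A note on this point: Keras's model.fit shuffles training batches by default, and tuner.search() forwards its keyword arguments to fit, so ordering can be preserved by splitting chronologically up front and passing shuffle=False. A minimal sketch (the split helper and data are illustrative, not from the video):

```python
def chronological_split(x, y, val_fraction=0.2):
    """Split a series into train/validation without shuffling,
    keeping the last val_fraction of samples for validation."""
    split = int(len(x) * (1 - val_fraction))
    return (x[:split], y[:split]), (x[split:], y[split:])

# Illustrative data: 10 time steps with targets.
x = list(range(10))
y = [v * 2 for v in x]
(x_train, y_train), (x_val, y_val) = chronological_split(x, y)
print(x_train, x_val)  # first 8 steps train, last 2 validate

# With Keras Tuner (sketch): pass the pre-split data and disable shuffling,
# since search() forwards keyword arguments to model.fit().
# tuner.search(x_train, y_train,
#              validation_data=(x_val, y_val),
#              shuffle=False)
```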

Very nice.
How can I extract the best parameters for each layer? For example: how many filters in each CNN layer, which activation function was chosen, what is the size of the filters? Etc.

Basically, get the structure of the model.

Thanks
Eran

eranfeit
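
A note on extracting the winning configuration: Keras Tuner exposes it as a plain dict. The calls in the comments below are real Keras Tuner API; since no tuner run is available here, a hypothetical values dict stands in for the result:

```python
# After tuner.search(...) finishes, the chosen hyperparameters are a dict:
#   best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
#   print(best_hp.values)
# and the full architecture can be inspected with:
#   tuner.get_best_models(num_models=1)[0].summary()

# Hypothetical example of what best_hp.values might contain
# (the names depend entirely on the hp.* calls in your build function):
best_values = {
    "conv_1_filters": 64,
    "conv_1_kernel": 3,
    "activation": "relu",
    "dropout": 0.2,
}
for name, value in sorted(best_values.items()):
    print(f"{name} = {value}")
```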

How could I implement this with a CNN? I'm working with my own dataset and it seems like the Keras tuners don't like tf.data.Datasets yet; they're still expecting (x_train, y_train), (x_test, y_test). Is my thinking correct there? Essentially I'm loading my data using … and would like to feed it into the tuner.

How could I split my own data into (x_train, y_train), (x_test, y_test)?

alberro
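
A note on the split question: with data already loaded as arrays or lists, a seeded shuffle followed by slicing gives the (x_train, y_train), (x_test, y_test) shape the tuner examples expect. A pure-Python sketch (sklearn's train_test_split does the same job; for a tf.data.Dataset, Dataset.take and Dataset.skip are the equivalent tools):

```python
import random

def train_test_split(x, y, test_fraction=0.2, seed=42):
    """Shuffle (x, y) pairs with a fixed seed, then slice off a test set."""
    pairs = list(zip(x, y))
    random.Random(seed).shuffle(pairs)
    split = int(len(pairs) * (1 - test_fraction))
    train, test = pairs[:split], pairs[split:]
    x_train, y_train = map(list, zip(*train))
    x_test, y_test = map(list, zip(*test))
    return (x_train, y_train), (x_test, y_test)

x = list(range(100))
y = [v % 2 for v in x]  # illustrative labels
(x_train, y_train), (x_test, y_test) = train_test_split(x, y)
print(len(x_train), len(x_test))  # 80 20
```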

Is it possible to use this method with LSTM and xLSTM?

hassanayaz

Hi, I want to know how I can get the name of the activation function and other info about the best model. For example, for the best model, which activation function was used?

leiladarabi

If I want to add k-fold cross-validation during hyperparameter tuning, do I put tuner.search() (i.e. simply replacing the model.fit() call with it) inside my k-fold loop?
Thanks for the great vid!

bencekato
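
A note on the k-fold question: yes, that pattern works, with the caveat that a tuner reloads its saved trial state, so each fold should get its own project_name (or directory) to keep searches independent. The fold indices themselves are plain Python (a sketch; sklearn's KFold is equivalent):

```python
def kfold_indices(n_samples, k):
    """Yield (train_idx, val_idx) pairs for k contiguous folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        val_idx = list(range(start, start + size))
        train_idx = [i for i in range(n_samples)
                     if i < start or i >= start + size]
        yield train_idx, val_idx
        start += size

folds = list(kfold_indices(10, 5))
print(len(folds))  # 5 folds, each holding out 2 samples

# Sketch of the loop (hypothetical tuner setup, numpy arrays assumed,
# one tuner per fold):
# for fold, (tr, va) in enumerate(kfold_indices(len(x), 5)):
#     tuner = keras_tuner.RandomSearch(build_model, objective="val_accuracy",
#                                      project_name=f"fold_{fold}")
#     tuner.search(x[tr], y[tr], validation_data=(x[va], y[va]), epochs=20)
```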

This was a fantastic tutorial. I am curious to know if I could do the same with signal data using a 1D CNN Model? If so, how would the input shape differ in that scenario since I am not working with photo data?

AbuZaynab

Hi Greg!
I hope you are doing well. I wanted to ask if there is a way to use Keras Tuner with a stacked model, one built from two models.
And if there is, what is the best way to do it?

adaravivianagomezcruz

Hi Greg!
Thanks for creating this tutorial, it helped me so much.
But I'm curious about the possibility of applying Keras Tuner after augmenting the number of image samples with ImageDataGenerator. Is it possible?

royranggarofiulazmi

Is there a way to get the best number of layers as well?

robottalks
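
A note on this question: Keras Tuner supports a variable depth by calling hp.Int inside a loop in the build function; the winning layer count then appears in get_best_hyperparameters(). A sketch assuming keras_tuner and TensorFlow (not runnable here; layer names and ranges are illustrative):

```python
# def build_model(hp):
#     model = keras.Sequential()
#     # The number of hidden layers is itself a hyperparameter.
#     for i in range(hp.Int("num_layers", 1, 4)):
#         model.add(keras.layers.Dense(
#             units=hp.Int(f"units_{i}", 32, 128, step=32),
#             activation="relu"))
#     model.add(keras.layers.Dense(10, activation="softmax"))
#     model.compile(optimizer="adam",
#                   loss="sparse_categorical_crossentropy")
#     return model
#
# # After search():
# # tuner.get_best_hyperparameters()[0].get("num_layers")
```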

Your coding tutorials are way better than that review content.

commonsense

Hey Greg, do you prefer TensorFlow or PyTorch, and why? Sorry for bothering you.

asgardro