Using K-Fold Cross Validation with Keras (5.2)

K-Fold cross validation is an important technique for deep learning. This video introduces regular k-fold cross validation for regression, as well as stratified k-fold for classification. Cross-validation can be used for a wide array of tasks, such as error estimation, early stopping, and hyper-parameter optimization.
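The workflow described above can be sketched as follows. This is a minimal illustration with synthetic data and an assumed small network, not the code from the video:

```python
# Minimal sketch of k-fold cross validation for a Keras regression
# model, using synthetic data in place of the video's dataset.
import numpy as np
from sklearn.model_selection import KFold
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=200)

kf = KFold(n_splits=5, shuffle=True, random_state=42)
fold_scores = []
for train_idx, test_idx in kf.split(X):
    # A fresh model is built for every fold, so no model sees its own test fold.
    model = Sequential([Input(shape=(4,)),
                        Dense(16, activation="relu"),
                        Dense(1)])
    model.compile(loss="mean_squared_error", optimizer="adam")
    model.fit(X[train_idx], y[train_idx], epochs=20, verbose=0)
    fold_scores.append(model.evaluate(X[test_idx], y[test_idx], verbose=0))

print(f"Mean out-of-fold MSE: {np.mean(fold_scores):.3f}")
```

The mean of the five out-of-fold errors is a less noisy estimate of generalization error than a single train/test split.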


Comments

This is a great video. I am just finishing up my degree in computer engineering and was stuck on an assignment until I watched this video. It has great production quality, and you provide enough information for the viewer to understand what's actually going on. Thanks for this!

thatguycalledaustin

Thank you! This was explained so well and clearly. I was looking for a solution all over the internet and finally found one! Let's get coding...

BlackBlingGirl

Thanks so much for the video, I didn't know about this, I love the way scikit learn integrates with other frameworks, keep it up!

I wish I could take this course in person :)!

MrCutter

Hi, thank you for your amazing video series!
One thing I'm wondering about is that you are using your validation data as the test data. I've read that validation data shouldn't influence the model during training. Is it fine, and formally correct, to use the validation data as the test data?

felixmuller

In the first part of your study, you standardized the income, aspect, save_rate and subscriptions columns. Why did you standardize only these columns?

kaanakdik

Great! Like a good teacher, you taught me the intuition and answered the question. Good, I ran my k-fold algorithm, but now what? What kind of decisions could I make with this? Really, thanks! Because on the internet people talk mainly about the algorithm but not about the decisions.

klausrichter

Hi, in the last example, which model (out of the five models) is used for the holdout data?

seyedmostafahallaji

How can I compare or plot the loss and val_loss in this case?

alisaghi
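One way to approach this question: keep the History object that `model.fit` returns for each fold and plot its `loss` and `val_loss` curves. A minimal sketch with synthetic data (the fold count, epoch count, and output file name are arbitrary choices, not from the video):

```python
# Sketch: record each fold's training history, then plot loss vs. val_loss.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is needed
import matplotlib.pyplot as plt
from sklearn.model_selection import KFold
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X.sum(axis=1)

histories = []
for tr, te in KFold(n_splits=3, shuffle=True, random_state=0).split(X):
    model = Sequential([Input(shape=(3,)), Dense(8, activation="relu"), Dense(1)])
    model.compile(loss="mse", optimizer="adam")
    # Passing validation_data makes Keras log val_loss alongside loss.
    h = model.fit(X[tr], y[tr], validation_data=(X[te], y[te]),
                  epochs=10, verbose=0)
    histories.append(h.history)

for fold, h in enumerate(histories):
    plt.plot(h["loss"], label=f"fold {fold} loss")
    plt.plot(h["val_loss"], "--", label=f"fold {fold} val_loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.savefig("kfold_losses.png")
```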

Hey Jeff, it would be awesome if you could show how to go about combining the models into a single one rather than voting.

zebcode
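For regression, averaging the fold models' predictions is the usual alternative to voting; for classification, averaging the predicted class probabilities ("soft voting") plays the same role. A minimal sketch where random probability arrays stand in for the outputs of five fold models (purely illustrative placeholders):

```python
# Sketch: average class probabilities from k fold models instead of
# hard-voting on their labels. The probabilities here are random
# placeholders with shape (models, samples, classes).
import numpy as np

rng = np.random.default_rng(1)
preds = rng.dirichlet(np.ones(3), size=(5, 4))  # 5 models, 4 samples, 3 classes

avg = preds.mean(axis=0)      # average over the 5 models
labels = avg.argmax(axis=1)   # final class for each sample
print(labels)
```

Averaging probabilities keeps each model's confidence information, which hard voting throws away.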

How do you do the normalization with the holdout? Is X_main used to find the scaling and offset, which are then applied to the holdout?

joshuamills
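That is the standard recipe: fit the scaler on the main/training portion only, then reuse its learned statistics on the holdout. A minimal sketch using scikit-learn's StandardScaler with synthetic data (the variable names mirror the question above, not the video's code):

```python
# Sketch: learn mean and scale from X_main only, then apply the same
# transform to the holdout set (never refit on holdout data).
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
X_main = rng.normal(loc=5.0, scale=2.0, size=(100, 2))
X_holdout = rng.normal(loc=5.0, scale=2.0, size=(20, 2))

scaler = StandardScaler().fit(X_main)        # statistics come from X_main only
X_main_std = scaler.transform(X_main)
X_holdout_std = scaler.transform(X_holdout)  # reuses X_main's mean and scale
```

Fitting the scaler on the holdout (or on the combined data) would leak holdout statistics into preprocessing and bias the error estimate.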

Hey Jeff! I hope I get a reply as early as possible because I'm working on a deep learning project. My dataset consists of image frames (residing in a separate folder) with corresponding XML annotation files in another folder. I want to train a model for object detection. Can I use k-fold cross-validation on this kind of dataset? If yes, how? If not, what would be the best approach?

GeekGenius

Will stratified k-fold reduce the problem of an imbalanced dataset in the case of multi-class classification? Or is it necessary to perform SMOTE in addition to stratified k-fold to overcome the imbalance?

pradeepgb
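On the stratification question above: StratifiedKFold only preserves the class ratio inside each fold; it does not rebalance the data, so oversampling such as SMOTE (applied to the training folds only) is a separate, complementary step. A small demonstration on a synthetic 90/10 imbalanced label vector:

```python
# Sketch: StratifiedKFold keeps the 90/10 class ratio in every fold,
# but it does not oversample the minority class.
import numpy as np
from sklearn.model_selection import StratifiedKFold

y = np.array([0] * 90 + [1] * 10)   # imbalanced labels: 90 vs. 10
X = np.zeros((100, 1))              # features are irrelevant for this demo

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
fold_counts = [np.bincount(y[test_idx]) for _, test_idx in skf.split(X, y)]
print(fold_counts)  # each test fold: 18 majority, 2 minority samples
```

Each test fold ends up with exactly 18 majority and 2 minority samples, the same 90/10 ratio as the full dataset, which is why stratification stabilizes the fold scores without fixing the imbalance itself.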