Regularization in a Neural Network explained

In this video, we explain the concept of regularization in an artificial neural network and also show how to specify regularization in code with Keras.
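
Below is a minimal sketch of what specifying L2 regularization can look like with the Keras Sequential API. The layer sizes, input shape, and the 0.01 penalty strength are illustrative choices, not values taken from the video:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras import regularizers

# kernel_regularizer adds an L2 penalty on the layer's weights to the loss,
# which discourages large weights and helps reduce overfitting.
model = Sequential([
    Dense(16, input_shape=(10,), activation='relu',
          kernel_regularizer=regularizers.l2(0.01)),
    Dense(16, activation='relu',
          kernel_regularizer=regularizers.l2(0.01)),
    Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])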

🕒🦎 VIDEO SECTIONS 🦎🕒

00:30 Help deeplizard add video timestamps - See example in the description
05:25 Collective Intelligence and the DEEPLIZARD HIVEMIND

💥🦎 DEEPLIZARD COMMUNITY RESOURCES 🦎💥

👋 Hey, we're Chris and Mandy, the creators of deeplizard!
👀 CHECK OUT OUR VLOG:

👉 Check out the blog post and other resources for this video:

💻 DOWNLOAD ACCESS TO CODE FILES
🤖 Available for members of the deeplizard hivemind:

🧠 Support collective intelligence, join the deeplizard hivemind:

🤜 Support collective intelligence, create a quiz question for this video:

🚀 Boost collective intelligence by sharing this video on social media!

❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
Tammy
Prash
Zach Wimpee

👀 Follow deeplizard:

🎓 Deep Learning with deeplizard:

🎓 Other Courses:

🛒 Check out products deeplizard recommends on Amazon:

📕 Get a FREE 30-day Audible trial and 2 FREE audio books using deeplizard's link:

🎵 deeplizard uses music by Kevin MacLeod

❤️ Please use the knowledge gained from deeplizard content for good, not evil.
Comments

0:14 intro
1:09 L2 regularization
3:00 why regularization
4:15 Keras

CosmiaNebula

Absolutely brilliant! DeepLizard is now among my favorites for Deep Learning architecture and tuning concepts. Thank you for making this video.

vtrandal

Thank you very much for this video! It gave me a good grasp of what regularization is, what it's for, and how it's implemented in Keras!

tymothylim

Simple, direct, informative and helpful! Thank you a million!

mohaliyet

Watching or listening to your videos is a really good way for me to learn new things comfortably and quickly. Accurate explanations, useful examples, working code: everything is awesome. Thank you!

Joe-qpow

Thank you very much for this clear and helpful explanation.
Words fail to express my gratitude.

qusayhamad

I really love this channel. Thank you so much. Great work.

abderrahmanedjerourou

Your videos are beyond amazing, with to-the-point and lucid explanations. Thanks!

adwaitnaik

I spotted a slight issue in the article for this video.

The article ends by saying "We are ready now to look at the concept of a learning rate. I'll see ya there.", but the next article in line (by clicking the "Next" button) is for Batch Size.

I really enjoy your courses so far, by the way. I've stopped and started a few times with studying ML in the past, but this has been a pleasure to go through.

fritz-c

Thanks again! I like the whole deeplizard series. It explains the things I want to understand better. I often feel like a clown when I try to test and play with the basics of neural networks, but you help me understand more.

Nissearne

Heard your voice in the Hitman 2: Silent Assassin game.
"47, this is your agent Diana from
By the way,
Very helpful content ♥️! 👍

AvinashSingh-bkkg

Hey, I recommend these videos to everybody interested in learning about Deep Learning. I love that your videos don't exceed 7 minutes, have good explanations, and a good implementation in Keras. Perfect!

The only thing that would make it even better is if you could add links to the documents you mention in your videos, for example a link to all the regularization functions in Keras. It would be a complete learning tool then.
Keep making these videos forever!

smritisings

You've got a really good voice, and the video is really informative.

shivamjuna

Amazing how you break down the topics into such short, simple, and easily understandable videos! Great job!

vladiklass

Great, I think I've understood it now! Informative and to the point, thank you!

michaelmuller

Really helped me to understand the concept. Thanks!

sivakumar-homw

I like your lectures... keep it up, really beneficial!

kkhanhy

I had been looking for this for so long!

haseebshah

Again, another fantastic video :) May I suggest that for the Jupyter notebook portions (here and in other videos), especially if there are only a few cells, you magnify the images? Maybe even 3x bigger? If I'm watching on my phone or tablet, that would help immensely :) Thanks a lot :D

richarda

Okay, very nice video, but: regularization penalizes too much network complexity... this is done by adding a term to the loss function that penalizes large weights -> but I don't get the connection: why does model complexity increase if the weights get bigger?
3:50 there you explained it :)
So is it really only about decreasing the weights and, because of that, having very small or zero weights, which means fewer connections between the neurons, OR are there more reasons for the correlation between weight "size" and model complexity?

manuelkarner
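
As a reference for the question above, the L2 penalty discussed around 1:09 and 3:50 in the video can be written in the standard textbook form (λ denotes the regularization strength; this is the common formulation, not a quote from the video):

L_{\text{reg}}(w) = L(w) + \lambda \sum_{j} w_j^{2}

Minimizing the extra term pushes every weight toward zero, so connections that contribute little end up with very small or zero weights, which is why smaller weights correspond to a simpler, less overfit-prone model.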