Which Activation Function Should I Use?

All neural networks use activation functions, but the reasons for using them are rarely made clear! Let's discuss what activation functions are, when they should be used, and what the differences between them are.
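The key reason activation functions exist is that without a nonlinearity, stacking linear layers is pointless. Here is a minimal NumPy sketch (not the video's sample code; the weights are made-up illustration values) showing that two linear layers collapse into one linear map until an activation is inserted:

```python
import numpy as np

# Two linear layers with no activation collapse into one linear map:
# W2 @ (W1 @ x) == (W2 @ W1) @ x, so extra depth adds no expressive power.
W1 = np.array([[1.0, -1.0, 0.0],
               [0.0,  1.0, 1.0]])
W2 = np.array([[1.0, 2.0]])
x = np.array([1.0, 2.0, 3.0])

stacked = W2 @ (W1 @ x)       # [9.]
collapsed = (W2 @ W1) @ x     # [9.] -- identical to the two-layer version
assert np.allclose(stacked, collapsed)

# Insert a nonlinearity (here ReLU) between the layers and the collapse breaks:
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)  # [10.] -- no longer a single linear map
```

With the activation in place, depth actually buys the network new functions it can represent.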

Sample code from this video:

Please subscribe! And like. And comment. That's what keeps me going.

More Learning resources:

Join us in the Wizards Slack channel:

And please support me on Patreon:
Follow me:
Signup for my newsletter for exciting updates in the field of AI:
Comments

Thanks, my biological neural network now has learned how to choose activation functions!

Skythedragon

From experience I'd recommend in order, ELU (exponential linear units) >> leaky ReLU > ReLU > tanh, sigmoid. I agree that you basically never have an excuse to use tanh or sigmoid.

StephenRoseDuo
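The ranking in the comment above can be sketched in NumPy. This is my own illustration, not the commenter's code: definitions of the five activations mentioned, plus the vanishing-gradient behavior of sigmoid that motivates preferring the ReLU family.

```python
import numpy as np

# The activations ranked above: ELU >> leaky ReLU > ReLU > tanh, sigmoid.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    return np.maximum(z, 0.0)

def leaky_relu(z, a=0.01):
    return np.where(z > 0, z, a * z)      # small slope for negative inputs

def elu(z, a=1.0):
    return np.where(z > 0, z, a * (np.exp(z) - 1.0))  # smooth negative tail

# Why sigmoid (and tanh) lose: their gradient vanishes for large |z|,
# while ReLU-family activations keep a constant gradient for z > 0.
def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)

print(sigmoid_grad(10.0))  # ~4.5e-5: almost no gradient flows back
```

Leaky ReLU and ELU additionally keep a nonzero gradient for negative inputs, which avoids the "dying ReLU" problem where a unit gets stuck outputting zero.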

Really enjoyed the video; you add subtle humor in between.

rafiakhan

Just watched your talk at TNW Conference 2017. I am really happy that you are growing every day. You are my motivation and my idol. Proud of you, love you!

hussain

I really like your videos as they strike the very sweet spot between being concise and precise!

quant-trader-

I love you man, 4 months passed and my prof could not explain it as you did, not even partially. Keep up the good work.
Thanks a lot

BOSS-bkjx

Wow, man, this is a seriously amazing video. Very entertaining and informative at the same time. Keep up great work! I'm now watching all your other videos :)

calinicated

Dude! DUUUDE! You are AMAZING! I've read multiple papers already, but now the stuff is really making sense to me!

pouyan

Hey Siraj, just wanted to say thanks again. Apparently you got carried away and got busted being sneaky with crediting. I still respect your hustle and hunger. I think your means justify your ends: if you didn't make the moves that you did to prop up the image etc., I probably wouldn't have found you and your resources. At the end of the day, you are in fact legit because you really 1) know what you're talking about (I hope), 2) empathize with someone learning this stuff (needed to break it down), and 3) raise awareness about low-hanging fruit that people outside the realm might not be aware of. Thank you again!!!!

captainwalter

I gained a lot of understanding and got that "click" moment after you explained linearity vs. non-linearity. Thanks man. Keep up with the dank memes. My dream is that some day I'd see a collab video between you, Dan Shiffman, and 3Blue1Brown. Love lots from the Philippines!

grainfrizz

Excellent and entertaining at a high level of entropy reduction. A fan.

supremehype

This guy needs more subs. Finally a good explanation. Thanks man!

gydo

Amazing video! Thank you! I'd never heard of neural networks until I started my internship. This is really fascinating.

drhf

By far the best Machine Learning videos I've watched. Amazing work! Love the energy and vibe!

kalreensdancevelventures

Dank memes and dank learning, both in the same video. Who would have thought. Thanks Raj!

waleedtahir

Learning more from your videos than from all my college classes combined!

MrJnsc

A valuable introduction to generative methods for establishing sense in artificial intelligence. A great way of bringing things together and expressing it in one single, accessible language.

Thanks Siraj Raval, great!

CristianMargiotta

Cool. Your lecture cleared the cloud in my brain. I now have a better understanding of the whole picture of activation functions.

slowcoding

Super clear & concise. Amazing simplicity. You Rock !!!

prateekraghuwanshi

Hey Siraj, here is a great trick: show us a neural net that can perform inductive reasoning! Great videos as always, keep them coming! Learning so much!

akompsupport