Deriving the ReLU Function for Neural Networks

In this video, we go over the derivation of the famous ReLU (rectified linear unit) function used in neural networks.
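The function discussed in the video can be sketched as follows; this is a minimal NumPy illustration of ReLU and its piecewise derivative, not the video's own code.

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): positive inputs pass through, the rest become 0
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Piecewise derivative: 1 where x > 0, 0 where x < 0.
    # At x = 0 the derivative is undefined; taking 0 there is a common convention.
    return (x > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))             # [0. 0. 3.]
print(relu_derivative(x))  # [0. 0. 1.]
```

Because the derivative is 1 on the positive side and 0 elsewhere, gradients flow unchanged through active units and are blocked through inactive ones.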

Comments

So if the function is 2x, then when you take the derivative you get 2.

I'm confused: if ReLU returned the constant 2 rather than 2x, wouldn't the derivative always be zero?

StevenSmith