Why Non-linear Activation Functions (C1W3L07)

Comments

It was like I was blind about linear activation functions, but now I'm gifted with vision :)

moeinhasani

These lectures deserve to be recognized as the bible of machine learning

anirudhsriram

As a layman clicking buttons to try to get a better understanding of large language models, I thought I was making some progress. Then I watched this video, and now I think I should go back to primary school 😢

tonyflow

Great video!!! I am still confused about why ReLU works when its properties are quite linear. I mean, I know it's a piecewise linear function and therefore does not meet the mathematical definition of a linear function. But by using ReLU, the output is still just a linear combination. Perhaps some neurons don't 'contribute', but the output is still the mathematical result of a linear combination of numbers.

SantoshGupta-jnwn
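
A minimal NumPy sketch of the point raised above, assuming a toy two-layer network with random weights (`linear_net` and `relu_net` are illustrative names, not anything from the lecture). It checks the additivity property a linear map must satisfy:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def linear_net(x):
    # Identity activation: the two layers collapse into one affine map.
    return W2 @ (W1 @ x + b1) + b2

def relu_net(x):
    # ReLU: which units get zeroed out depends on x, so the map is nonlinear.
    return W2 @ np.maximum(0, W1 @ x + b1) + b2

x, y, zero = rng.normal(size=3), rng.normal(size=3), np.zeros(3)

def additivity_gap(f):
    # A linear map satisfies f(x) + f(y) = f(x + y); subtract f(0) to drop the biases.
    return np.linalg.norm((f(x) - f(zero)) + (f(y) - f(zero)) - (f(x + y) - f(zero)))

print(additivity_gap(linear_net))  # ~0: still a single linear map in disguise
print(additivity_gap(relu_net))    # clearly nonzero: ReLU breaks linearity
```

On each region of input space where the same set of units is active, a ReLU network is indeed just a linear combination; but the active set changes with the input, and stitching those affine pieces together yields a nonlinear function overall, which is where the extra expressiveness comes from.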

I have a question: does the use of a non-linear activation increase model capacity?

TheThunderSpirit
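
A worked line of algebra, in the lecture's own notation, that bears on this question: with identity activations a two-layer network collapses into a single affine map, so depth adds no capacity at all.

$$
a^{[2]} = W^{[2]}\left(W^{[1]}x + b^{[1]}\right) + b^{[2]}
        = \left(W^{[2]}W^{[1]}\right)x + \left(W^{[2]}b^{[1]} + b^{[2]}\right)
        = W'x + b'
$$

A non-linear activation is precisely what prevents this collapse, letting the representable function class grow with depth rather than staying equivalent to one affine layer.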

From the videos above, I came to understand that ReLU is a linear function and the rest are non-linear functions. But how can we consider the sigmoid function binary? A binary function always gives output as 0 or 1, but the sigmoid varies as its input goes from negative infinity to positive infinity, touching the y-axis at 0.5?

chitralalawat
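
A small sketch (illustrative, not from the video) addressing the confusion above: the sigmoid's *input* ranges over all reals, but its *output* is squashed into (0, 1); it only becomes binary when you threshold at 0.5.

```python
import numpy as np

def sigmoid(z):
    # Output lies strictly between 0 and 1, crossing 0.5 at z = 0.
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-100.0, -2.0, 0.0, 2.0, 100.0])  # inputs can be any real number
p = sigmoid(z)
print(p)                       # approx. [0.0, 0.12, 0.5, 0.88, 1.0]
print((p > 0.5).astype(int))   # thresholding makes the prediction binary: [0 0 0 1 1]
```

So the sigmoid itself isn't binary; it outputs a probability-like value in (0, 1), and the 0/1 label comes from the threshold applied afterwards.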

It's so ridiculous that the speaker of this video is Chinese, but we only get Korean subtitles!!

ttxiton

This person has a lot of knowledge; he picks one thing and then starts explaining another and another. This disease is called explainailibalible, sorry.

techanddroid

Your English is difficult to understand. I keep going back to figure out what you mean by some words.

aqwkpfdhtla