Deep Learning (CS7015): Lec 4.1 Feedforward Neural Networks (a.k.a. multilayered network of neurons)

lec04mod01
Comments

@15:41 "with great complexity comes... great power." With great power comes great responsibility; with great responsibility comes great expectations; with great expectations comes great sacrifice; with great sacrifice comes great reward.
And thus... the objective function was maximized.

RahulMadhavan

Shouldn't W_L at 6:31 be 'k×n' and not the other way around?

syedhasany

In the a = b + W*h formula, either W should be transposed or W's size should be (no. of outputs × no. of inputs); only then does the matrix multiplication W*h work as expected.
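The shape constraint the comment describes can be checked directly in NumPy. The layer sizes below (3 inputs, 2 outputs) are illustrative, not taken from the lecture:

```python
import numpy as np

# Hypothetical layer sizes: 3 inputs, 2 outputs (not from the lecture).
n_in, n_out = 3, 2

h = np.ones(n_in)    # previous-layer activations, shape (3,)
b = np.zeros(n_out)  # biases, shape (2,)

# For a = b + W @ h to yield a shape-(2,) pre-activation,
# W must be (n_out, n_in), i.e. (no. of outputs x no. of inputs).
W = np.random.randn(n_out, n_in)
a = b + W @ h
assert a.shape == (n_out,)

# If W is instead stored as (n_in, n_out), it must be transposed first,
# which is the other fix the comment suggests.
W2 = np.random.randn(n_in, n_out)
a2 = b + W2.T @ h
assert a2.shape == (n_out,)
```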

jagadeeshkumarm

I think the objective loss function (yi_hat - yi)^2 is correct. It minimizes the error over all the training samples, i = 1 to N. What you did was write the error function granularly; both forms are needed.
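The two forms the comment mentions, per-sample squared error and the objective summed over i = 1..N, can be sketched as follows; the target and prediction values are made-up toy numbers:

```python
import numpy as np

# Toy targets and predictions for N = 4 samples (illustrative values only).
y     = np.array([1.0, 0.0, 1.0, 1.0])
y_hat = np.array([0.9, 0.2, 0.8, 1.0])

# The "granular" form: one squared error per training sample i.
per_sample = (y_hat - y) ** 2

# The training objective: the average over all i = 1..N,
# i.e. (1/N) * sum_i (y_hat_i - y_i)^2.
loss = per_sample.mean()
```

Minimizing `loss` during training drives every per-sample term down at once, which is why both views describe the same objective.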

BhuShu

There is a slight mistake in the formula a_i = b_i + W_i' * h_(i-1).

It makes sense when we check which weight w_i is multiplied by which x_i.

ashiqhussainkumar

Can anyone please explain the last error?
What does summation over i instances mean?

mlofficial

It will be min(1/k (fun)), not min(1/n (fun)).

anshrai

Find whether the following is a linearly separable problem or not:

        ((¬A OR B) AND 1) OR 0

Also create a neural network for the given equation with a
suitable set of weights.
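Since (X AND 1) = X and (X OR 0) = X, the expression reduces to ¬A OR B, which is linearly separable (only the point A=1, B=0 maps to 0). A single threshold neuron suffices; the weights below are one hypothetical choice, not the only one:

```python
def step(z):
    """Threshold activation: fire if the weighted sum is non-negative."""
    return 1 if z >= 0 else 0

def neuron(A, B, w_A=-1.0, w_B=1.0, bias=0.5):
    """Single neuron computing ((NOT A) OR B AND 1) OR 0 = (NOT A) OR B.

    w_A = -1, w_B = 1, bias = 0.5 is one weight set that separates
    the truth table; any weights with the same sign pattern and a
    bias in the right range would also work.
    """
    return step(w_A * A + w_B * B + bias)

# Verify the neuron against the full truth table of (NOT A) OR B.
for A in (0, 1):
    for B in (0, 1):
        assert neuron(A, B) == int((not A) or B)
```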

MuhammadWaseem-vbqe