Coding A Neural Network FROM SCRATCH! (Part 2)

Comments

I smell an underrated channel.
You are literally the savior of my science fair project, thank you so much.

FlatterBaker

Hey! I'm wondering when Part 3 is coming out. Can't wait to see it!

willysmb-bovn

The Brain function can be heavily simplified.

You can put the two edge cases outside of the loop, calling layers[0] and layers[layers.length - 1], and have the for loop start with i = 1 and run while i < layers.length - 1.
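Something like this, sketched against the Python port further down in the thread (the same idea applies to the C# Brain function from the video); it assumes at least two layers:

def brain(inputs):
    # First layer consumes the raw inputs.
    layers[0].forward(inputs)
    layers[0].activation()

    # Hidden layers consume the previous layer's outputs.
    for i in range(1, len(layers) - 1):
        layers[i].forward(layers[i - 1].nodeArray)
        layers[i].activation()

    # Output layer: forward pass only, no activation.
    layers[-1].forward(layers[-2].nodeArray)
    return layers[-1].nodeArray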

DarthAnimal

I've seen a lot of videos about neural networks and yours is the one that explains them in an understandable manner (or maybe the 10th time is the charm).
I'm curious to see the next one.

PatrykPonichtera

A great series. Not only for the content, but well edited too. Cheers, John.

drewdowsett

Great video. Thanks for the effort. I'm looking forward to seeing part 3. Cheers,

mehmatrix

I was following along in Python. Here's the code if anyone wants it. I didn't test it, though, because I don't really know how to use it. The tutorial was too short ):

import numpy as np

networkShape = [2, 4, 4, 2]

class Layer(object):
    def __init__(self, n_inputs, n_nodes):
        self.n_nodes = n_nodes
        self.n_inputs = n_inputs

        # Weights are an (n_nodes x n_inputs) matrix; biases and node values are vectors.
        self.weightsArray = np.zeros((n_nodes, n_inputs))
        self.biasesArray = np.zeros(n_nodes)
        self.nodeArray = np.zeros(n_nodes)

    def forward(self, inputsArray):
        self.nodeArray = np.zeros(self.n_nodes)

        for i in range(self.n_nodes):
            # Sum of the weights times inputs
            for j in range(self.n_inputs):
                self.nodeArray[i] += self.weightsArray[i, j] * inputsArray[j]
            # Add the bias
            self.nodeArray[i] += self.biasesArray[i]

    def activation(self):
        # ReLU: clamp negative node values to zero
        for i in range(self.n_nodes):
            if self.nodeArray[i] < 0:
                self.nodeArray[i] = 0

def awake():
    global layers

    layers = []
    for i in range(len(networkShape) - 1):
        layers.append(Layer(networkShape[i], networkShape[i + 1]))

def brain(inputs):
    for i in range(len(layers)):
        if i == 0:
            layers[i].forward(inputs)
            layers[i].activation()
        elif i == len(layers) - 1:
            layers[i].forward(layers[i - 1].nodeArray)
        else:
            layers[i].forward(layers[i - 1].nodeArray)
            layers[i].activation()

    return layers[-1].nodeArray
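
A rough guess at how it could be used (untested; the weights and biases are just randomized here so the forward pass produces something):

awake()
for layer in layers:
    # Random weights/biases so the network outputs something non-zero.
    layer.weightsArray = np.random.uniform(-1.0, 1.0, (layer.n_nodes, layer.n_inputs))
    layer.biasesArray = np.random.uniform(-1.0, 1.0, layer.n_nodes)

print(brain([0.5, -0.2]))   # two inputs in, two outputs out, per networkShape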

voil

Hopefully you'll finish this eventually! I enjoyed the last two videos.

patricksturgill

Nice video! In general I'd say a neural network is still a black box even if you built it and know the values of all the nodes, weights, biases and layers.

slarcraft

Thank you so much! This is exactly what I need for my uni project.

bencebob

Great video! I'm really looking forward to seeing how the network will be trained.

wyrdokward

Isn't the 'layer' in layer[ i ] = new Layer( networkShape[ i ], networkShape[ i + 1 ] ); supposed to be 'layers'?

DanielYong-ok

That was an excellent, practical video on neural networks! As someone just beginning to dig into this subject, I love it!

Also, clean and neat code. Enjoyable to read (although I'm not a fan of nesting classes).

gustavosalmeron

Hey! I wonder when episode 3 will come out?

Bloodbone

Best video I've ever seen, not gonna lie.

t.p.

Oh man, thank you! Such gem content out here :) I'm a Swift dev, so the code is not hard to grasp.

TheZazatv

:') Episode 3, where are you... this is the new Half-Life 3 for me. Am I wrong in thinking the weights and biases were never given values here? Should those be set in this class too?
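
Something like this could work, as a sketch against the Python port above rather than the video's code: give them random starting values in the constructor.

import numpy as np

class Layer(object):
    def __init__(self, n_inputs, n_nodes):
        self.n_nodes = n_nodes
        self.n_inputs = n_inputs
        # Start with random weights in [-1, 1] and zero biases, instead of all zeros.
        self.weightsArray = np.random.uniform(-1.0, 1.0, (n_nodes, n_inputs))
        self.biasesArray = np.zeros(n_nodes)
        self.nodeArray = np.zeros(n_nodes)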

erinleighlynch

I wonder about the shape of the network, I mean how many hidden layers and nodes we should use for a given problem.
Also wondering about the third part.

aesvarash

I need that next video, I have no idea what I'm doing :(

I have this code and I think I understood how it works after staring at it for an eternity, BUT how can I make use of it now...

thimodemoura

If the activation function always happens right after forward, why not just combine them?
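
Something like this could work as a sketch against the Python port above; the only catch is that the output layer there skips the activation, so a flag is needed:

# Drop-in replacement for Layer.forward in the Python port above
# (numpy already imported there); the output layer would call it with activate=False.
def forward(self, inputsArray, activate=True):
    self.nodeArray = np.zeros(self.n_nodes)
    for i in range(self.n_nodes):
        for j in range(self.n_inputs):
            self.nodeArray[i] += self.weightsArray[i, j] * inputsArray[j]
        self.nodeArray[i] += self.biasesArray[i]
        if activate and self.nodeArray[i] < 0:
            self.nodeArray[i] = 0   # ReLU, folded into the forward pass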

adirmugrabi