6- Implementing a neural network from scratch in Python

In this tutorial, I implement a neural network (Multilayer Perceptron) from scratch using Python and numpy. I focus on the network data representation and on forward propagation.
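For readers who want to follow along before opening the linked code, here is a minimal sketch of the kind of implementation the video walks through: the network's layers stored as a list of numpy weight matrices, and a forward pass that is one dot product plus a sigmoid per layer. The class and method names (MLP, forward_propagate, _sigmoid) and the default layer sizes are illustrative assumptions, not necessarily those used in the video.

import numpy as np

class MLP:
    """Minimal multilayer perceptron: random weights, sigmoid activations."""

    def __init__(self, num_inputs=3, hidden_layers=(3, 3), num_outputs=2):
        # Number of neurons in each layer, from input to output.
        layers = [num_inputs, *hidden_layers, num_outputs]
        # One weight matrix per pair of consecutive layers, shaped
        # (neurons in current layer, neurons in next layer).
        self.weights = [np.random.rand(layers[i], layers[i + 1])
                        for i in range(len(layers) - 1)]

    def forward_propagate(self, inputs):
        # Propagate the signal layer by layer: weighted sum, then sigmoid.
        activations = inputs
        for w in self.weights:
            net_inputs = np.dot(activations, w)
            activations = self._sigmoid(net_inputs)
        return activations

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

if __name__ == "__main__":
    mlp = MLP()
    outputs = mlp.forward_propagate(np.random.rand(3))
    print(outputs)  # two values, one per output neuron

Because np.dot performs a matrix-matrix product when given 2-D arrays, the same forward_propagate also accepts a batch of examples (e.g. an array of shape (100, 3)) and returns one output row per example.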

This video builds on the theory discussed in the previous video of the series. If you haven't watched the previous video (5- Computation in neural networks), I suggest you check that out first.

Computation in neural networks:

Code:

Interested in hiring me as a consultant/freelancer?

Join The Sound Of AI Slack community:

Follow Valerio on Facebook:

Valerio's LinkedIn:

Valerio's Twitter:
Comments

In the past two years I have gone through a lot of tutorials, but this series is the best!!! @Valerio Velardo: you are an excellent teacher. Thank you.

sdc

Only a few people out there explain what is actually happening under the hood; one is Valerio and the second is Sentdex. Thank you. People just import Keras and do BS, but Valerio explains the math behind it. Awesome

Moonwalkerrabhi

The most practical videos I have watched on machine learning

arifgur

The guy who carried me through my master's courses. Best wishes, my brother Valerio <3

stratosgiannilias

Bravo! I have to say, thank you Valerio. Your courses are fantastic.

deminan

You basically delivered the ultimate master class for learning this; it's wonderful!!
The flow of information feels so smooth, and yet you somehow end up integrating the concepts and the method in an easy way.
True gratitude!!

fabiasantcovsky

Super elegant implementation of layers, weights, and forward propagation with a single dot product inside a loop. Enjoyed coding it and testing it. Awesome.

airesearch

With your patient and detailed explanations, you've earned a subscriber. Thank you for your effort.

johnnyfry

These videos are the best. I bet no one else teaches this stuff so well.

HimanshuKumar-xztk

I came here from the first video and kept going. So far everything is cool, and I'm curious about the audio parts. Keep up your kindness, man. Thanks a lot again for your time

orhanors

@Valerio Velardo You are a great teacher. Thank you for everything I have learned from you

baharedavoodabadi

You're really a cool teacher, Valerio!

AliAsaadIM

Simply excellent. Congratulations. I hope you continue to help us. Thank you very much.

carlosmoreno

Great tutorial, well paced and easy to follow. Subscribed and liked!

lh

Your work is excellent; please make more videos.

muhammadawaisalamkhan

At 14:20, why are you using the sigmoid activation function instead of ReLU?

hackercop
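For context on the activation question above, these are the two functions being compared (standard definitions, not code from the video); either one could be dropped into the _sigmoid slot of the sketch near the top of the page.

import numpy as np

def sigmoid(x):
    # Squashes any real number into (0, 1); smooth and differentiable everywhere.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)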

My feeling is like that of someone blind seeing the world for the first time. #ThankYou

NaveenKumarasinghe

Great job, man. Please make a tutorial on WaveNet and teach it from scratch.

smilebig

Very nice, thanks! Correct me if I'm wrong, but in this implementation we assume the examples/input vectors are one-dimensional, right? Otherwise, if you have inputs with shape (100, 3), for example, and 2 neurons in the first layer, you'll have the first w with shape (100, 2), and it won't match the shape of the inputs when we want to do the matrix multiplication, right?

geogeo
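On the shape question above: in a from-scratch MLP like the sketch near the top of the page, each weight matrix is shaped by the number of input features and the number of neurons, not by the number of examples, so a batch of shape (100, 3) multiplies cleanly. A quick check (array names are only for illustration):

import numpy as np

batch = np.random.rand(100, 3)  # 100 examples, each with 3 input features
w1 = np.random.rand(3, 2)       # (input features, neurons) -- independent of batch size

out = np.dot(batch, w1)         # (100, 3) @ (3, 2) -> (100, 2)
print(out.shape)                # one 2-value activation per example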

I have a question: there are three values in the weights and also three values in the activations, so how can it return only two values?

sahil__d
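On the last question above: assuming the layer's weights form a 3x2 matrix (one row per incoming activation, one column per neuron in the next layer), the dot product collapses the three activations into two outputs. A worked example with made-up numbers:

import numpy as np

activations = np.array([0.5, 0.2, 0.9])  # three incoming activation values
weights = np.array([[0.1, 0.4],          # 3 rows: one per incoming activation
                    [0.7, 0.2],          # 2 columns: one per neuron in the next layer
                    [0.3, 0.6]])

outputs = np.dot(activations, weights)   # (3,) @ (3, 2) -> (2,)
print(outputs)  # two values, each a weighted sum over all three activations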