How to build a Simple Neural Network in Python (One Layer) Part I

Example is from the book "Machine Learning for Finance" by Jannes Klaas.
In this video we build a simple one-layer neural network from scratch in Python. Specifically, we set up the input layer, initialize random weights and a bias, and feed the data through the activation function (sigmoid).
The output is then compared with the actual output (y) and measured with the binary cross-entropy loss.
If you found this interesting, I will continue with optimizing the network using gradient descent and updating the parameters via backpropagation, and provide details on how to proceed.
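The steps from the video can be sketched in a few lines of NumPy. This is a minimal, self-contained example, not the video's exact code: the toy input X, the labels y, and the variable names are assumptions made for illustration.

```python
import numpy as np

np.random.seed(42)  # reproducible random weights

# Input layer: 3 samples with 3 features each, plus the actual outputs y
X = np.array([[0, 1, 0],
              [1, 0, 0],
              [1, 1, 1]])
y = np.array([[0], [1], [1]])

# Initialize random weights and a bias
w = np.random.randn(3, 1)
b = np.random.randn(1)

# z: weighted summation of the inputs plus the bias
z = X.dot(w) + b

# Sigmoid activation squashes z into (0, 1), giving the layer output A
A = 1 / (1 + np.exp(-z))

# Binary cross-entropy loss compares the layer output A with y
loss = -np.mean(y * np.log(A) + (1 - y) * np.log(1 - A))
print(loss)
```

Gradient descent and backpropagation (the topic of Part II) would then use this loss to update w and b.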

Link to Book (Not affiliated btw)

Mentioned articles:
Bias and weights

Cross Entropy Loss:

00:00 - 01:36 Introduction / Resources
01:36 - 02:31 Input Layer and output (y)
02:31 - 04:20 Initialize random weights and bias
04:20 - 04:50 Getting z (Summation + bias)
04:50 - 05:53 Sigmoid function and getting A (Output of layer)
05:53 - 06:16 Comparing layer output with y
06:16 - 10:20 Binary Cross Entropy Loss
10:20 - 10:45 What needs to be done / outlook

#Python #NeuralNetwork
Comments

Been studying finance, and seeing the math I've been learning all this time applied to neural networks is awesome. Thanks for this clear introduction

crunchybaguette

Very good explanation with examples and information. Waiting for part II.

sarwar

Congratulations, extremely didactic. Eagerly awaiting the continuation.👏👏

cassioandrade

We are looking forward to your videos, especially those related to neural networks

bcode

Thank you for bringing this topic up ☺️☺️

Ghulinzer

Hi, when will there be a part II of this video? I was wondering how this could be used to train a neural network to develop a trading strategy!

stijneeltink

Sir, I would highly request that you write code for a stock pattern screener (head and shoulders, etc.)

alluram

Please try to code a Nadaraya-Watson envelope on real data from Binance

rafalsza