A Gentle Introduction To Math Behind Neural Networks and Deep Learning (nested composite function)

In this video, we explain the basic mathematics behind neural networks and deep learning through a simple classification example. We start by defining a threshold logic unit (TLU), also known as a neuron. We then discuss the concept of a weighted sum, or linear combination, followed by the application of a nonlinear activation function such as the sigmoid or ReLU. We then put a few units together to show how their relationships can be expressed as matrix/vector multiplications. The main idea is that we can extract meaningful features from the input to build better classification models. As a result, a neural network or deep learning model can be represented as a nested, or composite, function.
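
A minimal sketch of the ideas above (the layer sizes, weights, and input values here are illustrative assumptions, not taken from the video): each layer computes a weighted sum of its inputs as a matrix/vector product, applies a nonlinear activation such as the sigmoid, and the full network is the nested composite of these layer functions.

import numpy as np

def sigmoid(z):
    # Nonlinear activation: squashes the weighted sum into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def layer(W, b, x):
    # Weighted sum (linear combination) written as a matrix/vector product,
    # followed by the elementwise nonlinear activation
    return sigmoid(W @ x + b)

# Illustrative 2-layer network: 3 inputs -> 4 hidden units -> 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

x = np.array([0.5, -1.2, 3.0])         # example input features
y = layer(W2, b2, layer(W1, b1, x))    # nested composite function: f2(f1(x))
print(y)                               # value in (0, 1), usable as a class probability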
#neuralnetworks #deeplearning #mathematics