Introduction to PyTorch and Deep Learning II - Deep Neural Networks & Training in PyTorch - Lecture

Welcome to the Introduction to PyTorch and Deep Learning online class, which covers fundamental topics in machine learning and deep learning.

The lecture begins with an introduction to deep feedforward networks and their architecture, including the role of input, output, and hidden layers. We then discuss the XOR problem: learning a function of two binary inputs that outputs 1 when exactly one input is 1, and 0 when both inputs are 0 or both are 1. Because XOR is not linearly separable, a single linear layer cannot represent it; we show how deep feedforward networks with a hidden layer can solve the problem.
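
As a rough sketch of such a network (not the lecture's own code), a small feedforward model for XOR in PyTorch might look like the following; the layer sizes here are illustrative assumptions:

import torch
import torch.nn as nn

# XOR truth table: four input pairs and their targets
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

# XOR is not linearly separable, so at least one nonlinear hidden layer
# is needed. Two hidden units suffice in theory; a few more make
# training more reliable in practice.
model = nn.Sequential(
    nn.Linear(2, 4),   # input layer -> hidden layer
    nn.ReLU(),         # elementwise nonlinearity
    nn.Linear(4, 1),   # hidden layer -> output
    nn.Sigmoid(),      # squash the output into (0, 1)
)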

Next, we dive into the backpropagation algorithm and gradient descent, which are the key components of training deep feedforward networks. We cover the basic concepts of backpropagation, including how it computes the gradient of the loss with respect to each weight in the network. We also discuss gradient descent, the optimization algorithm that uses those gradients to find weights minimizing the error between the network's output and the desired output.
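
In PyTorch these two ideas correspond to loss.backward() (backpropagation) and an optimizer step (gradient descent). A minimal sketch, assuming the model, X, and y from the example above:

import torch.optim as optim

criterion = nn.BCELoss()                            # error between output and target
optimizer = optim.SGD(model.parameters(), lr=0.1)   # plain gradient descent

pred = model(X)              # forward pass
loss = criterion(pred, y)    # scalar training error
optimizer.zero_grad()        # clear gradients from any previous step
loss.backward()              # backpropagation: compute d(loss)/d(weight) for every parameter
optimizer.step()             # gradient descent: move each weight against its gradient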

Finally, we apply the backpropagation and gradient descent algorithms to the XOR problem, showing how deep feedforward networks can learn to solve the problem through iterative training.
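
A hedged sketch of that iterative training, reusing the pieces above (the epoch count and learning rate are arbitrary choices, not the lecture's settings):

for epoch in range(5000):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    optimizer.step()

# After training, the network should map the XOR inputs close to their targets
# (results can vary with random initialization).
with torch.no_grad():
    print(model(X).round())   # ideally: [[0.], [1.], [1.], [0.]]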

0:00 Introduction and Outline
2:32 Deep Feedforward Networks
20:44 The XOR Problem with Feedforward Networks
28:09 Backpropagation & Gradient Descent

*Materials*

*Creators*