Machine Learning 1: Lesson 8

Today we start the second half of the course - we're moving from decision-tree-based approaches like random forests to gradient-descent-based approaches like deep learning.

Our first step in this journey will be to use PyTorch to help us implement logistic regression from scratch. We'll be building a model for the classic MNIST dataset of hand-written digits.
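A minimal sketch of the kind of model the lesson builds, assuming a plain PyTorch setup (the class and variable names here are illustrative, not the lesson's exact notebook code):

import torch
import torch.nn as nn
import torch.nn.functional as F

# Logistic regression "from scratch": a weight matrix and bias created
# directly as Parameters, followed by log-softmax over the 10 digit classes.
class LogReg(nn.Module):
    def __init__(self, n_in=28 * 28, n_out=10):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_in, n_out) / n_in)
        self.b = nn.Parameter(torch.zeros(n_out))

    def forward(self, x):
        x = x.view(x.size(0), -1)                      # flatten 28x28 images
        return F.log_softmax(x @ self.w + self.b, dim=1)

# One hypothetical training step on a stand-in batch (random tensors here,
# not the real MNIST data).
model = LogReg()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
xb = torch.randn(64, 1, 28, 28)
yb = torch.randint(0, 10, (64,))
loss = F.nll_loss(model(xb), yb)
loss.backward()
opt.step()
opt.zero_grad()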
Comments

At 1:23:16, when you defined get_weights, it only had one parameter, but when you used it in the LogReg, you passed in two arguments. I don't really understand that.

ihgnmah
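On the get_weights question above: this is most likely Python's *args syntax, where a single starred parameter collects however many positional arguments are passed. A minimal sketch, assuming a definition roughly along these lines (the exact scaling is an assumption, not the lesson's exact code):

import torch
import torch.nn as nn

# One starred parameter (*dims) accepts any number of positional arguments,
# so get_weights(784, 10) and get_weights(10) both match this definition.
def get_weights(*dims):
    return nn.Parameter(torch.randn(dims) / dims[0])  # dims is a tuple, e.g. (784, 10)

w = get_weights(28 * 28, 10)   # weight matrix: two arguments
b = get_weights(10)            # bias vector: one argument
print(w.shape, b.shape)        # torch.Size([784, 10]) torch.Size([10])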

What does Jeremy mean when he says, "You can only do around log2(n) decisions, so if there's a time series it needs to fit that takes four steps, then suddenly there's not many decisions left for it to make" (1:33)? How does he reach this conclusion?

ho
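One way to read that remark (the numbers below are made up for illustration): a balanced binary tree that splits all the way down to individual rows has a root-to-leaf path of roughly log2(n) splits, so every prediction is the product of only about that many yes/no decisions, and any splits spent reproducing a time trend are unavailable for other interactions.

import math

n = 1_000_000                 # hypothetical number of training rows
path_len = math.log2(n)       # depth of a balanced tree isolating single rows
print(round(path_len))        # -> 20 binary decisions from root to leaf

# If ~4 of those splits are used up just following a time-series trend,
# only ~16 remain for every other pattern in the data.
print(round(path_len) - 4)    # -> 16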

MNIST link is dead :( deeplearning.net is unavailable...

ashpats

At 18:19, that is not normalization, it's standardization. Normalization is scaling the data using its minimum and maximum.

bijan
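A tiny NumPy sketch of the distinction drawn above, on toy data rather than the lesson's:

import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0])   # toy data

# Standardization: subtract the mean, divide by the standard deviation.
standardized = (x - x.mean()) / x.std()

# Min-max normalization: rescale to the [0, 1] range using min and max.
normalized = (x - x.min()) / (x.max() - x.min())

print(standardized)   # mean ~0, std ~1
print(normalized)     # values between 0 and 1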

What do I need to know to build a deep learning framework? Please tell me which courses and books to study. Please answer me.

aidenstill