Machine Learning Lecture 31 'Random Forests / Bagging' -Cornell CS4780 SP17

Lecture Notes:

Comments

Thank you, dear Professor, for making these available to us. Not only do you make the material interesting, but you have a way of explaining it at a deep level, making the concepts so much clearer for us to grasp.

sameerkhnl

It goes without saying that you are a great teacher. I also like how you always mention the names of the people who invented these algorithms! :) It makes the class a lot more engaging for me.

vatsan

Wow! That was the most astonishing thing I could ever have imagined finding on the internet about machine learning. Thank you, professor, for sharing your deep insight.

rezasadeghi

You are such an awesome teacher. I laughed and learned simultaneously. Thanks.

puneetjain

This is the best of all the ML lecture series.

shrishtrivedi

I don't think I have ever written a comment on YouTube before; this is my first. I just wanted to thank you for sharing these amazing lectures, and for your wonderful teaching methodology and explanations.

abdelmoniemdarwish

“Boosting brings me to tears sometimes.” “You gotta eat a lot of fruit before the next lecture.” I love you.

thirstyfrenchie

Hats off, Sir. You are truly a great teacher.

sandipsamaddar

17:54 volunteers 🤣
Thanks, prof, for the fun and interesting lecture; I got to revise these fundamentals quickly 🙏

newbie

Beautiful lecture. Greetings from Brazil, professor!

BrunoSouza-wyet

Prof. Weinberger,
Thank you for posting your course online; it has been an extremely helpful and enjoyable learning experience.
Will you post those (or future) recitations online at some point? They would add a lot of value by supplementing the lectures, helping online learners like myself get a better understanding.
Thank you.

TheCrmagic

Your lecture is awesome, Sir. It also brings me to tears (38:02).

baohoquoc

In the definition of out-of-bag error, what do we usually take as the loss function when implementing classification with a random forest?

aragasparyan
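
On the out-of-bag question above: for classification, the out-of-bag error is usually reported under the 0-1 loss, i.e. the misclassification rate each training point incurs when predicted only by the trees whose bootstrap samples did not contain it. A minimal sketch, assuming scikit-learn's RandomForestClassifier and a toy dataset from make_classification (my choice of illustration, not something from the lecture):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Toy binary classification data, purely for illustration.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

    # oob_score=True scores every training point using only the trees that
    # did not see it; oob_score_ is the resulting OOB accuracy, so
    # 1 - oob_score_ is the OOB error under the 0-1 loss.
    forest = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
    forest.fit(X, y)
    print("OOB error (0-1 loss):", 1.0 - forest.oob_score_)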

At 34:46 you say that the bias is not a function of H (your hypothesis) but of the average classifier, and that is why the bias is low. Could you also explain whether this is because, when you sum uncorrelated errors to form the mean classifier, they cancel out?

JoaoVitorBRgomes
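
On the question above, a sketch of the standard argument (my paraphrase, not a transcript of the lecture): averaging classifiers does not change the expected prediction, so the bias term is untouched; what the "uncorrelated errors cancel" intuition buys you is the variance term. Writing the bagged classifier as $\hat{h}(x) = \frac{1}{m}\sum_{i=1}^{m} h_i(x)$, where each $h_i$ has the same expectation $\bar{h}(x)$, per-point variance $\sigma^2$, and pairwise correlation $\rho$:

    \mathbb{E}\big[\hat{h}(x)\big] = \bar{h}(x)
    \qquad \text{(averaging leaves the bias term untouched)},

    \mathrm{Var}\big[\hat{h}(x)\big] = \rho\,\sigma^{2} + \frac{1-\rho}{m}\,\sigma^{2},
    \qquad \text{which reduces to } \sigma^{2}/m \text{ when the trees' errors are uncorrelated } (\rho = 0).

So the bagged classifier inherits, approximately, the bias of the average classifier $\bar{h}$, which is small for deep trees, while its variance shrinks toward $\rho\sigma^2$ as $m$ grows; the errors (nearly) summing to zero is the variance story rather than a separate reason the bias is low.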

Hi Professor Kilian, around the 27:00 mark you said the estimator in a random forest is an unbiased estimator. I am not able to understand why it is unbiased; could you explain the unbiasedness a bit? Thanks in advance.

rakeshkumarmallik

Prof. Weinberger,
When we use bootstrapping, it duplicates records. Wouldn't that be a problem when training our model? It is like giving more weight to some records, which might introduce bias.
Also, I would like your opinion in general on whether we should remove duplicate records during preprocessing, because of the i.i.d. assumption, which I believe every machine learning algorithm relies on.

abhisheksingla
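
On the duplicates question above: a bootstrap sample of size n drawn with replacement leaves each original record out with probability (1 - 1/n)^n ≈ 1/e ≈ 36.8%, so roughly 63% of the distinct records appear in any given tree's sample (some more than once) and the rest become that tree's out-of-bag points. The implicit re-weighting is different for every tree and averages out across the forest. A small sketch of that calculation, assuming NumPy (illustrative only):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # One bootstrap sample: n indices drawn uniformly with replacement.
    bootstrap_idx = rng.integers(0, n, size=n)

    distinct_fraction = np.unique(bootstrap_idx).size / n
    print("fraction of distinct records in the bootstrap sample:", distinct_fraction)
    # Typically close to 1 - 1/e ≈ 0.632; the remaining ~37% are out-of-bag
    # for this particular tree.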

Dear Prof. Weinberger,
First of all, thank you for publishing your lectures. They are awesome!
I would like to ask in what way random forests can be used to perform feature selection, given that each tree in the forest does not consider all the features; can you explain how the features are evaluated by looking at the trees?
Thank you in advance.
Best regards,

NG
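
On the feature-selection question above: although each individual split only considers a random subset of the features, every feature still gets its chance across the many trees and many splits, so importances can be aggregated over the whole forest. Two common recipes (this sketch assumes scikit-learn; it is one possible approach, not the one from the lecture) are the impurity-based feature_importances_ and permutation importance on held-out data:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    # Toy data with only a few genuinely informative features.
    X, y = make_classification(n_samples=1000, n_features=10, n_informative=3,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    forest = RandomForestClassifier(n_estimators=200, random_state=0)
    forest.fit(X_train, y_train)

    # Mean decrease in impurity, averaged over every split in every tree.
    print("impurity-based importances:", forest.feature_importances_)

    # Permutation importance: drop in held-out accuracy when one feature
    # is randomly shuffled, breaking its relationship with the label.
    result = permutation_importance(forest, X_test, y_test,
                                    n_repeats=10, random_state=0)
    print("permutation importances:", result.importances_mean)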


Thanks for this great lecture; very helpful.

omerfarukyasar

Thank you for your great lessons, prof! (From Morocco)

moumniable

Excellent lecture, thank you very much for uploading and sharing your knowledge.

doloressanchez