Machine Learning Lecture 11 "Logistic Regression" - Cornell CS4780 SP17

Comments

Am I the only one who raises his hand at home whenever he says "raise your hands"? :P

vatsan

Many professors have knowledge, but only a few have enthusiasm while teaching.

Enem_Verse

You literally saved my comprehension of Statistical Learning, thanks!

kodjigarpp

Awesome lectures!! Glad to have bumped into one of them; after that, spending time on the entire series felt worthwhile.

sandeepreddy

A fantastic lecture. Thank you professor.

smallstone

Thanks for posting all these lectures, Dr. Weinberger. Should make Siraj Raval aware of their availability!

rolandheinze

I love the way he gets so excited when he says TADA! xD

vatsan

These video lectures are great! I completed 12 in 2 days! I find them more intuitive than Andrew Ng's. Also, Prof, have you ever recorded lectures on unsupervised learning? I would love to watch those, since they are missing from this series.

dhrumilshah

This is great stuff. It's just funny that those are motorized chalkboards instead of dry-erase boards.

YulinZhang

It would also be nice to see a dataset correctly classified by Naive Bayes, and whether Logistic Regression then optimizes the hyperplane even further.

ugurkap

@KilianWeinberger Unfortunately, online viewers don't have access to the course homework, but I think your claim at 20:31 is only valid if, in each dimension, the data from classes +1 and -1 happen to come from Gaussian distributions with the same variance. Otherwise, you would need quadratic terms too.

mhsnk
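The shared-variance point above can be checked numerically. A minimal sketch (not from the lecture), assuming 1-D Gaussian class-conditionals and equal priors: with a shared variance the log-odds is linear in x (its second difference over equally spaced points vanishes), while class-dependent variances leave a quadratic term.

```python
import numpy as np

def log_odds(x, mu_p, mu_n, var_p, var_n):
    # log P(x|+1) - log P(x|-1) for 1-D Gaussian class-conditionals
    # (equal priors, so this equals the log-odds of the posterior)
    lp = -0.5 * np.log(2 * np.pi * var_p) - (x - mu_p) ** 2 / (2 * var_p)
    ln = -0.5 * np.log(2 * np.pi * var_n) - (x - mu_n) ** 2 / (2 * var_n)
    return lp - ln

xs = np.array([0.0, 1.0, 2.0])  # equally spaced probe points

# Shared variance: second difference is ~0, so the log-odds is linear in x
shared = log_odds(xs, 1.0, -1.0, 0.5, 0.5)
print(np.diff(shared, 2))  # ~[0.]

# Class-dependent variances: a quadratic term survives
diff_var = log_odds(xs, 1.0, -1.0, 0.5, 2.0)
print(np.diff(diff_var, 2))  # nonzero
```

With unequal variances the decision boundary becomes quadratic, which is why the linear (logistic) form only drops out under the shared-variance assumption.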

Weinberger is one of the best machine learning lecturers.

xiaoweidu

Interesting to learn the link between naive Bayes and logistic regression. Thank you! For the spam email example with very high-dimensional features, logistic regression won't work, right?

taketaxisky

Hi Kilian, the flow of your lectures is awesome. How you build upon the concepts is amazing.
Do you have the Matlab code shared publicly? Really cool demos.

nrupatunga

It was nice learning about the connection between naive Bayes and logistic regression. However, at the moment, I am only able to see the connection between Gaussian NB and logistic regression. Is there some way to get to logistic regression if the features are not real-valued?

jachawkvr

I couldn't prove that Naive Bayes with continuous features is a linear classifier, except in the case where I assumed the variance doesn't vary across the labels y (spam vs. ham, for example) and only varies across the input variables x_alpha. Was anyone able to prove it?

sudhanshuvashisht

Since we are using the same form of distribution for P(Y|X) in NB and logistic regression, are we still making the same underlying assumption of conditional independence of X_i given Y in the case of logistic regression? Or does directly estimating the parameters of P(Y|X) mean that we are relaxing that assumption?

sinhavaibhav

Can we say that Gaussian Naive Bayes is logistic regression in the case of continuous features?

sekfook

Hi Kilian, since Naive Bayes comes up with a hyperplane that separates two distributions rather than two datasets, does the same statement hold even if the input dataset is highly imbalanced? I mean, can we still proceed without balancing?

satyagv

So what's best for estimating P(y|x) in logistic regression: MAP or MLE?

JoaoVitorBRgomes
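On the MAP-vs-MLE question above, a minimal sketch (the toy data, step size, and iteration count are illustrative assumptions, not from the lecture): MLE minimizes the negative log-likelihood of the logistic model by gradient descent, and MAP with a Gaussian prior on w is the same computation with an added L2 penalty (set lam > 0 below).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data: two Gaussian blobs, labels y in {-1, +1}
X = rng.normal(size=(200, 2)) + np.array([1.0, 1.0])
X[:100] -= 2.0                      # shift first blob to mean (-1, -1)
y = np.r_[-np.ones(100), np.ones(100)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
lam = 0.0  # lam > 0 gives MAP with a Gaussian prior (L2 regularization)
for _ in range(500):
    # gradient of sum_i log(1 + exp(-y_i w.x_i)) + (lam/2) ||w||^2
    g = -(y * sigmoid(-y * (X @ w))) @ X + lam * w
    w -= 0.1 * g / len(y)

acc = np.mean(np.sign(X @ w) == y)
print(w, acc)
```

With lam = 0 this is plain MLE; in practice some regularization (i.e., MAP) is preferred when features are high-dimensional or the classes are separable, since the unregularized MLE weights can diverge.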