Logistic Regression Details Pt 2: Maximum Likelihood

This video follows from where we left off in Part 1 in this series on the details of Logistic Regression. This time we're going to talk about how the squiggly line is optimized to best fit the data.
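
If you'd like to see the idea in code, below is a minimal Python sketch (my own illustration, not from the video; the data and variable names are made up) that fits the logistic squiggle by nudging the coefficients uphill on the log-likelihood with plain gradient ascent:

import numpy as np

# Made-up example data: a continuous measurement (e.g. weight) and a 0/1 label (e.g. obese or not)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0, 0, 1, 0, 1, 1])

b0, b1 = 0.0, 0.0   # intercept and slope on the log(odds) scale
lr = 0.01           # step size for gradient ascent

for _ in range(10000):
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))   # the squiggle: predicted probability that y = 1
    # The log-likelihood is sum(y*log(p) + (1 - y)*log(1 - p));
    # its gradient with respect to (b0, b1) is (sum(y - p), sum((y - p)*x)).
    b0 += lr * np.sum(y - p)
    b1 += lr * np.sum((y - p) * x)

print(b0, b1)       # coefficients of the maximum-likelihood squiggle

In practice a routine like R's glm(..., family = "binomial") does this with a smarter optimizer, but the quantity being maximized is the same log-likelihood described in the video.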

NOTE: This StatQuest assumes that you are already familiar with Part 1 in this series, Logistic Regression Details Pt1: Coefficients:

For a complete index of all the StatQuest videos, check out:

If you'd like to support StatQuest, please consider...

Buying The StatQuest Illustrated Guide to Machine Learning!!!

...or...

...a cool StatQuest t-shirt or sweatshirt:

...buying one or two of my songs (or go large and get a whole album!)

...or just donating to StatQuest!

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:

#statquest #logistic #MLE
Comments

NOTE: In statistics, machine learning, and most programming languages, the default base for the log() function is 'e'. In other words, when I write "log()", I mean "natural log()", or "ln()". Thus, the log to the base 'e' of 2.718 is (approximately) 1, because e ≈ 2.718. (A quick code check of this default is included just below this comment.)

statquest
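
As a quick, illustrative check of the note above (my own snippet, not Josh's): Python's built-in math.log(), numpy's np.log(), and R's log() all default to base 'e'.

import math
print(math.log(math.e))    # 1.0 -- log() with no base given is the natural log, ln()
print(math.log(7.389056))  # ~2.0, since e squared is about 7.389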

Dear Josh, I want you to know that there are many of us who are so thankful to have you.

balamuralikannaiyan

Over the past year of my MBA, I have been taught the concepts of logistic regression by Math PhDs and Analytics Gurus, but no one could beat the simplicity and elegance of your explanation.

shadwal

The simplicity of your work is what is truly needed in education. Too often, professors and tutors try to teach complex material in a complex way to make themselves seem smarter. Too many ignore the fact that learning complex things in a simplified manner is much more beneficial. Your work does this perfectly.

Dayman

This explanation is so good that I feel kinda guilty for having access to it. There is no doubt that if people in the past had access to this, then their lives would've been a lot easier. I feel like a spoiled brat. This explanation is too good for this world.

notnilc

Easily the most intuitive and detailed explanation of logistic regression + max likelihood on the web. Period.

nampai

This is where our tuition fees should have gone.

Lj-znej

Thank you Josh. Now we understand that, back in our classroom days, the classmates who did better weren't necessarily smarter; they just had exceptional teachers like you. Thank you for democratising statistics and machine learning and bridging the gap. More power to you.

AnuragHalderEcon

I teach graduate-level operations research courses which require some understanding of probability and statistics. My students often had a limited understanding of some fundamentals even if they had taken related courses before. I found your videos organized and concise. I will recommend your channel to my future students. Thank you, Josh!

Shawn-vwgk

Dear Josh, your videos are amazing and I would have never passed my qualification exam without them. Also, this is the first time I actually understood how MLE works. Thank you so much!

EvgeniaOlimp

Hi Josh, I hope you know how directly you are changing people's lives. A lot of people are earning very good salaries because of your quality content, and when one person gets a job, their entire family benefits. So you are not just helping one single person but thousands of families. May the Almighty bless you and keep you happy, healthy, and wise.

riyaz

This is a true blessing for data science students 🎉 love u 3000 😊

tonyleung

Hi Josh, you are single-handedly carrying me through my Masters program with your videos. I was seriously considering dropping out earlier this year because I was having a lot of difficulty understanding anything, but a lot of things are starting to click now thanks to you. Your vids are a godsend to students everywhere.

pereeia

This is the first time I clearly understood Maximum Likelihood and Logistic Regression. Thanks for your videos.

harishjulapalli

I must say you are a magician. You have the tricks to communicate and deliver just what people want. We want more like you. Thank you so much for shaping the world.

TheAbhiporwal

The best explanation of logistic regression ever; it includes every detail. Thanks a lot Josh, I adore you so much!

atrayeedasgupta

This dude is a saint. These videos really condense the ideas into easy-to-follow steps.

lambuth

So far, the best explanation of maximum likelihood estimation on YouTube. The log odds of me being better at math would have been significantly higher had you been my math teacher in high school. Thanks Josh.

ahmetjeyhunov

I am a student of data science. When I first saw the calculations behind the logistic regression algorithm, I was scared and wondered whether I could survive in this field. But after seeing your content on this algorithm, I'm ready to play with it. Thank you so much, sir, for this valuable content.

rishabsaini

Words cannot express how grateful I am for this amazing explanation.

k.akenou